8119 1726773005.06889: starting run
ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
8119 1726773005.17635: Added group all to inventory
8119 1726773005.17638: Added group ungrouped to inventory
8119 1726773005.17642: Group all now contains ungrouped
8119 1726773005.17646: Examining possible inventory source: /tmp/kernel_settings-Xvl/inventory.yml
8119 1726773005.17764: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/cache
8119 1726773005.17821: Loading CacheModule 'memory' from /usr/local/lib/python3.9/site-packages/ansible/plugins/cache/memory.py
8119 1726773005.17845: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/inventory
8119 1726773005.17910: Loading InventoryModule 'host_list' from /usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/host_list.py
8119 1726773005.17984: Loaded config def from plugin (inventory/script)
8119 1726773005.17988: Loading InventoryModule 'script' from /usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/script.py
8119 1726773005.18019: Loading InventoryModule 'auto' from /usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/auto.py
8119 1726773005.18073: Loaded config def from plugin (inventory/yaml)
8119 1726773005.18077: Loading InventoryModule 'yaml' from /usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/yaml.py
8119 1726773005.18134: Loading InventoryModule 'ini' from /usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/ini.py
8119 1726773005.18185: Loading InventoryModule 'toml' from /usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/toml.py
8119 1726773005.18190: Attempting to use plugin host_list (/usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/host_list.py)
8119 1726773005.18193: Attempting to use plugin script (/usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/script.py)
8119 1726773005.18198: Attempting to use plugin auto (/usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/auto.py)
8119 1726773005.18202: Loading data from /tmp/kernel_settings-Xvl/inventory.yml
8119 1726773005.18254: /tmp/kernel_settings-Xvl/inventory.yml was not parsable by auto
8119 1726773005.18284: Attempting to use plugin yaml (/usr/local/lib/python3.9/site-packages/ansible/plugins/inventory/yaml.py)
8119 1726773005.18356: Loading data from /tmp/kernel_settings-Xvl/inventory.yml
8119 1726773005.18411: group all already in inventory
8119 1726773005.18419: set inventory_file for managed_node1
8119 1726773005.18424: set inventory_dir for managed_node1
8119 1726773005.18425: Added host managed_node1 to inventory
8119 1726773005.18428: Added host managed_node1 to group all
8119 1726773005.18429: set ansible_host for managed_node1
8119 1726773005.18430: set ansible_ssh_extra_args for managed_node1
8119 1726773005.18433: set inventory_file for managed_node2
8119 1726773005.18436: set inventory_dir for managed_node2
8119 1726773005.18437: Added host managed_node2 to inventory
8119 1726773005.18439: Added host managed_node2 to group all
8119 1726773005.18440: set ansible_host for managed_node2
8119
1726773005.18441: set ansible_ssh_extra_args for managed_node2 8119 1726773005.18444: set inventory_file for managed_node3 8119 1726773005.18446: set inventory_dir for managed_node3 8119 1726773005.18448: Added host managed_node3 to inventory 8119 1726773005.18449: Added host managed_node3 to group all 8119 1726773005.18451: set ansible_host for managed_node3 8119 1726773005.18452: set ansible_ssh_extra_args for managed_node3 8119 1726773005.18454: Reconcile groups and hosts in inventory. 8119 1726773005.18458: Group ungrouped now contains managed_node1 8119 1726773005.18460: Group ungrouped now contains managed_node2 8119 1726773005.18462: Group ungrouped now contains managed_node3 8119 1726773005.18470: Loading CacheModule 'memory' from /usr/local/lib/python3.9/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 8119 1726773005.19631: Loaded config def from plugin (connection/buildah) 8119 1726773005.19641: Loading Connection 'buildah' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/buildah.py (found_in_cache=False, class_only=True) 8119 1726773005.19717: Loaded config def from plugin (connection/chroot) 8119 1726773005.19723: Loading Connection 'chroot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/chroot.py (found_in_cache=False, class_only=True) 8119 1726773005.19892: Loaded config def from plugin (connection/docker) 8119 1726773005.19898: Loading Connection 'docker' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/docker.py (found_in_cache=False, class_only=True) 8119 1726773005.19958: Loaded config def from plugin (connection/funcd) 8119 1726773005.23458: Loaded config def from plugin (connection/httpapi) 8119 1726773005.23467: Loading Connection 'httpapi' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/httpapi.py (found_in_cache=False, class_only=True) 8119 1726773005.23548: Loaded config def from plugin (connection/iocage) 8119 1726773005.23554: Loading Connection 'iocage' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/iocage.py (found_in_cache=False, class_only=True) 8119 1726773005.23585: Loaded config def from plugin (connection/jail) 8119 1726773005.23589: Loading Connection 'jail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/jail.py (found_in_cache=False, class_only=True) 8119 1726773005.23778: Loaded config def from plugin (connection/kubectl) 8119 1726773005.23786: Loading Connection 'kubectl' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/kubectl.py (found_in_cache=False, class_only=True) 8119 1726773005.23828: Loaded config def from plugin (connection/libvirt_lxc) 8119 1726773005.23833: Loading Connection 'libvirt_lxc' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/libvirt_lxc.py (found_in_cache=False, class_only=True) 8119 1726773005.23864: Loading Connection 'local' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 8119 1726773005.23927: Loaded config def from plugin (connection/lxc) 8119 1726773005.23932: Loading Connection 'lxc' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/lxc.py (found_in_cache=False, class_only=True) 8119 1726773005.23975: Loaded config def from plugin (connection/lxd) 8119 1726773005.23979: Loading Connection 'lxd' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/lxd.py (found_in_cache=False, class_only=True) 
8119 1726773005.24178: Loaded config def from plugin (connection/napalm) 8119 1726773005.24187: Loading Connection 'napalm' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/napalm.py (found_in_cache=False, class_only=True) 8119 1726773005.24527: Loaded config def from plugin (connection/netconf) 8119 1726773005.24534: Loading Connection 'netconf' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/netconf.py (found_in_cache=False, class_only=True) 8119 1726773005.25100: Loaded config def from plugin (connection/network_cli) 8119 1726773005.25107: Loading Connection 'network_cli' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/network_cli.py (found_in_cache=False, class_only=True) 8119 1726773005.25468: Loaded config def from plugin (connection/oc) 8119 1726773005.25475: Loading Connection 'oc' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/oc.py (found_in_cache=False, class_only=True) 8119 1726773005.25791: Loaded config def from plugin (connection/paramiko_ssh) 8119 1726773005.25798: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 8119 1726773005.28400: Loaded config def from plugin (connection/persistent) 8119 1726773005.28409: Loading Connection 'persistent' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/persistent.py (found_in_cache=False, class_only=True) 8119 1726773005.28493: Loaded config def from plugin (connection/podman) 8119 1726773005.28501: Loading Connection 'podman' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/podman.py (found_in_cache=False, class_only=True) 8119 1726773005.29659: Loaded config def from plugin (connection/psrp) 8119 1726773005.29668: Loading Connection 'psrp' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 8119 1726773005.29744: Loaded config def from plugin (connection/qubes) 8119 1726773005.29752: Loading Connection 'qubes' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/qubes.py (found_in_cache=False, class_only=True) 8119 1726773005.29819: Loading Connection 'saltstack' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/saltstack.py (found_in_cache=False, class_only=True) 8119 1726773005.30375: Loaded config def from plugin (connection/ssh) 8119 1726773005.30382: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 8119 1726773005.30712: Loaded config def from plugin (connection/vmware_tools) 8119 1726773005.30720: Loading Connection 'vmware_tools' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/vmware_tools.py (found_in_cache=False, class_only=True) 8119 1726773005.31462: Loaded config def from plugin (connection/winrm) 8119 1726773005.31470: Loading Connection 'winrm' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 8119 1726773005.31540: Loaded config def from plugin (connection/zone) 8119 1726773005.31547: Loading Connection 'zone' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/zone.py (found_in_cache=False, class_only=True) 8119 1726773005.31725: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments 8119 1726773005.31933: Loading ModuleDocFragment 'shell_windows' 
from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/shell_windows.py 8119 1726773005.32005: Loaded config def from plugin (shell/cmd) 8119 1726773005.32011: Loading ShellModule 'cmd' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 8119 1726773005.32068: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/shell_common.py 8119 1726773005.32196: Loaded config def from plugin (shell/csh) 8119 1726773005.32202: Loading ShellModule 'csh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/csh.py (found_in_cache=False, class_only=True) 8119 1726773005.32267: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/shell_common.py (found_in_cache=True, class_only=False) 8119 1726773005.32384: Loaded config def from plugin (shell/fish) 8119 1726773005.32390: Loading ShellModule 'fish' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/fish.py (found_in_cache=False, class_only=True) 8119 1726773005.32414: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 8119 1726773005.32475: Loaded config def from plugin (shell/powershell) 8119 1726773005.32481: Loading ShellModule 'powershell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 8119 1726773005.32510: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/shell_common.py (found_in_cache=True, class_only=False) 8119 1726773005.32629: Loaded config def from plugin (shell/sh) 8119 1726773005.32635: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 8119 1726773005.32900: Loaded config def from plugin (become/doas) 8119 1726773005.32906: Loading BecomeModule 'doas' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/doas.py (found_in_cache=False, class_only=True) 8119 1726773005.33062: Loaded config def from plugin (become/dzdo) 8119 1726773005.33068: Loading BecomeModule 'dzdo' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/dzdo.py (found_in_cache=False, class_only=True) 8119 1726773005.33138: Loaded config def from plugin (become/enable) 8119 1726773005.33144: Loading BecomeModule 'enable' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/enable.py (found_in_cache=False, class_only=True) 8119 1726773005.33329: Loaded config def from plugin (become/ksu) 8119 1726773005.33335: Loading BecomeModule 'ksu' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/ksu.py (found_in_cache=False, class_only=True) 8119 1726773005.33493: Loaded config def from plugin (become/machinectl) 8119 1726773005.33499: Loading BecomeModule 'machinectl' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/machinectl.py (found_in_cache=False, class_only=True) 8119 1726773005.33686: Loaded config def from plugin (become/pbrun) 8119 1726773005.33692: Loading BecomeModule 'pbrun' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/pbrun.py (found_in_cache=False, class_only=True) 8119 1726773005.33878: Loaded config def from plugin (become/pfexec) 8119 1726773005.33886: Loading BecomeModule 'pfexec' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/become/pfexec.py (found_in_cache=False, class_only=True) 8119 1726773005.34011: Loaded config def from plugin (become/pmrun) 8119 1726773005.34017: Loading BecomeModule 'pmrun' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/pmrun.py (found_in_cache=False, class_only=True) 8119 1726773005.34149: Loaded config def from plugin (become/runas) 8119 1726773005.34155: Loading BecomeModule 'runas' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 8119 1726773005.34343: Loaded config def from plugin (become/sesu) 8119 1726773005.34350: Loading BecomeModule 'sesu' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/sesu.py (found_in_cache=False, class_only=True) 8119 1726773005.34539: Loaded config def from plugin (become/su) 8119 1726773005.34545: Loading BecomeModule 'su' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 8119 1726773005.34707: Loaded config def from plugin (become/sudo) 8119 1726773005.34714: Loading BecomeModule 'sudo' from /usr/local/lib/python3.9/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) 8119 1726773005.34744: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml 8119 1726773005.36151: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/action 8119 1726773005.36286: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__ 8119 1726773005.36376: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773005.38334: trying /usr/local/lib/python3.9/site-packages/ansible/modules 8119 1726773005.38357: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud 8119 1726773005.38393: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering 8119 1726773005.38413: trying /usr/local/lib/python3.9/site-packages/ansible/modules/commands 8119 1726773005.38430: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto 8119 1726773005.38458: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database 8119 1726773005.38470: trying /usr/local/lib/python3.9/site-packages/ansible/modules/files 8119 1726773005.38654: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773005.38760: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773005.40100: in VariableManager get_vars() 8119 1726773005.40243: done with get_vars() 8119 1726773005.40322: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/callback 8119 1726773005.40527: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py 8119 1726773005.40658: Loaded config def from plugin (callback/debug) 8119 1726773005.40664: Loading CallbackModule 'debug' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/debug.py 8119 1726773005.40820: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.40949: Loaded config def from plugin (callback/actionable) 8119 1726773005.40955: 
Loading CallbackModule 'actionable' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/actionable.py (found_in_cache=False, class_only=True) 8119 1726773005.40999: Loading CallbackModule 'aws_resource_actions' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/aws_resource_actions.py (found_in_cache=False, class_only=True) 8119 1726773005.41085: Loaded config def from plugin (callback/cgroup_memory_recap) 8119 1726773005.41092: Loading CallbackModule 'cgroup_memory_recap' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/cgroup_memory_recap.py (found_in_cache=False, class_only=True) 8119 1726773005.41494: Loaded config def from plugin (callback/cgroup_perf_recap) 8119 1726773005.41500: Loading CallbackModule 'cgroup_perf_recap' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/cgroup_perf_recap.py (found_in_cache=False, class_only=True) 8119 1726773005.41544: Loading CallbackModule 'context_demo' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/context_demo.py (found_in_cache=False, class_only=True) 8119 1726773005.41602: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.41727: Loaded config def from plugin (callback/counter_enabled) 8119 1726773005.41732: Loading CallbackModule 'counter_enabled' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/counter_enabled.py (found_in_cache=False, class_only=True) 8119 1726773005.41737: Loading CallbackModule 'debug' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/debug.py (found_in_cache=False, class_only=True) 8119 1726773005.41789: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.41915: Loaded config def from plugin (callback/default) 8119 1726773005.41921: Loading CallbackModule 'default' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/default.py (found_in_cache=False, class_only=True) 8119 1726773005.41989: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.42122: Loaded config def from plugin (callback/dense) 8119 1726773005.42128: Loading CallbackModule 'dense' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/dense.py (found_in_cache=False, class_only=True) 8119 1726773005.42278: Loaded config def from plugin (callback/foreman) 8119 1726773005.42285: Loading CallbackModule 'foreman' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/foreman.py (found_in_cache=False, class_only=True) 8119 1726773005.42340: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.42467: Loaded config def from plugin (callback/full_skip) 8119 1726773005.42473: Loading CallbackModule 'full_skip' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/full_skip.py (found_in_cache=False, class_only=True) 8119 1726773005.42660: Loaded config def from plugin (callback/grafana_annotations) 8119 1726773005.42667: Loading CallbackModule 'grafana_annotations' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/callback/grafana_annotations.py (found_in_cache=False, class_only=True) 8119 1726773005.42811: Loaded config def from plugin (callback/hipchat) 8119 1726773005.42816: Loading CallbackModule 'hipchat' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/hipchat.py (found_in_cache=False, class_only=True) 8119 1726773005.42905: Loaded config def from plugin (callback/jabber) 8119 1726773005.42912: Loading CallbackModule 'jabber' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/jabber.py (found_in_cache=False, class_only=True) 8119 1726773005.42993: Loaded config def from plugin (callback/json) 8119 1726773005.42999: Loading CallbackModule 'json' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/json.py (found_in_cache=False, class_only=True) 8119 1726773005.43193: Loaded config def from plugin (callback/junit) 8119 1726773005.43200: Loading CallbackModule 'junit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/junit.py (found_in_cache=False, class_only=True) 8119 1726773005.43268: Loaded config def from plugin (callback/log_plays) 8119 1726773005.43273: Loading CallbackModule 'log_plays' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/log_plays.py (found_in_cache=False, class_only=True) 8119 1726773005.43427: Loaded config def from plugin (callback/logdna) 8119 1726773005.43433: Loading CallbackModule 'logdna' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/logdna.py (found_in_cache=False, class_only=True) 8119 1726773005.43636: Loaded config def from plugin (callback/logentries) 8119 1726773005.43643: Loading CallbackModule 'logentries' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/logentries.py (found_in_cache=False, class_only=True) 8119 1726773005.43738: Loaded config def from plugin (callback/logstash) 8119 1726773005.43744: Loading CallbackModule 'logstash' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/logstash.py (found_in_cache=False, class_only=True) 8119 1726773005.43993: Loaded config def from plugin (callback/mail) 8119 1726773005.43997: Loading CallbackModule 'mail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/mail.py (found_in_cache=False, class_only=True) 8119 1726773005.44036: Loading CallbackModule 'minimal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/minimal.py (found_in_cache=False, class_only=True) 8119 1726773005.44168: Loaded config def from plugin (callback/nrdp) 8119 1726773005.44173: Loading CallbackModule 'nrdp' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/nrdp.py (found_in_cache=False, class_only=True) 8119 1726773005.44210: Loading CallbackModule 'null' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/null.py (found_in_cache=False, class_only=True) 8119 1726773005.44247: Loading CallbackModule 'oneline' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/oneline.py (found_in_cache=False, class_only=True) 8119 1726773005.44292: Loading CallbackModule 'osx_say' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/osx_say.py (found_in_cache=False, class_only=True) 8119 1726773005.44334: Loading CallbackModule 'profile_roles' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/profile_roles.py (found_in_cache=False, class_only=True) 8119 1726773005.44415: Loaded config def from plugin (callback/profile_tasks) 8119 
1726773005.44420: Loading CallbackModule 'profile_tasks' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/profile_tasks.py (found_in_cache=False, class_only=True) 8119 1726773005.44464: Loading CallbackModule 'say' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/say.py (found_in_cache=False, class_only=True) 8119 1726773005.44538: Loaded config def from plugin (callback/selective) 8119 1726773005.44543: Loading CallbackModule 'selective' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/selective.py (found_in_cache=False, class_only=True) 8119 1726773005.44592: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.44746: Loaded config def from plugin (callback/skippy) 8119 1726773005.44751: Loading CallbackModule 'skippy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/skippy.py (found_in_cache=False, class_only=True) 8119 1726773005.44872: Loaded config def from plugin (callback/slack) 8119 1726773005.44877: Loading CallbackModule 'slack' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/slack.py (found_in_cache=False, class_only=True) 8119 1726773005.44957: Loaded config def from plugin (callback/splunk) 8119 1726773005.44963: Loading CallbackModule 'splunk' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/splunk.py (found_in_cache=False, class_only=True) 8119 1726773005.45019: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.45151: Loaded config def from plugin (callback/stderr) 8119 1726773005.45157: Loading CallbackModule 'stderr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/stderr.py (found_in_cache=False, class_only=True) 8119 1726773005.45223: Loaded config def from plugin (callback/sumologic) 8119 1726773005.45229: Loading CallbackModule 'sumologic' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/sumologic.py (found_in_cache=False, class_only=True) 8119 1726773005.45422: Loaded config def from plugin (callback/syslog_json) 8119 1726773005.45428: Loading CallbackModule 'syslog_json' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/syslog_json.py (found_in_cache=False, class_only=True) 8119 1726773005.45470: Loading CallbackModule 'timer' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/timer.py (found_in_cache=False, class_only=True) 8119 1726773005.45518: Loading CallbackModule 'tree' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/tree.py (found_in_cache=False, class_only=True) 8119 1726773005.45575: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.45706: Loaded config def from plugin (callback/unixy) 8119 1726773005.45712: Loading CallbackModule 'unixy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/unixy.py (found_in_cache=False, class_only=True) 8119 1726773005.45790: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.9/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8119 1726773005.45919: Loaded config def from plugin 
(callback/yaml)
8119 1726773005.45925: Loading CallbackModule 'yaml' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/yaml.py (found_in_cache=False, class_only=True)
8119 1726773005.45931: Loading CallbackModule 'profile_tasks' from /usr/local/lib/python3.9/site-packages/ansible/plugins/callback/profile_tasks.py (found_in_cache=True, class_only=True)
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_change_settings.yml ********************************************
1 plays in /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml
8119 1726773005.46012: in VariableManager get_vars()
8119 1726773005.46037: done with get_vars()
8119 1726773005.46048: in VariableManager get_vars()
8119 1726773005.46065: done with get_vars()
8119 1726773005.46091: in VariableManager get_vars()
8119 1726773005.46115: done with get_vars()

PLAY [Test changing settings] **************************************************
8119 1726773005.46813: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/strategy
8119 1726773005.46859: Loading StrategyModule 'linear' from /usr/local/lib/python3.9/site-packages/ansible/plugins/strategy/linear.py
8119 1726773005.46902: getting the remaining hosts for this loop
8119 1726773005.46907: done getting the remaining hosts for this loop
8119 1726773005.46914: building list of next tasks for hosts
8119 1726773005.46917: getting the next task for host managed_node2
8119 1726773005.46922: done getting next task for host managed_node2
8119 1726773005.46925: ^ task is: TASK: Gathering Facts
8119 1726773005.46928: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, run_state=ITERATING_SETUP, fail_state=FAILED_NONE, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8119 1726773005.46930: done building task lists
8119 1726773005.46932: counting tasks in each state of execution
8119 1726773005.46935: done counting tasks in each state of execution: num_setups: 1 num_tasks: 0 num_rescue: 0 num_always: 0
8119 1726773005.46938: advancing hosts in ITERATING_SETUP
8119 1726773005.46940: starting to advance hosts
8119 1726773005.46942: getting the next task for host managed_node2
8119 1726773005.46945: done getting next task for host managed_node2
8119 1726773005.46948: ^ task is: TASK: Gathering Facts
8119 1726773005.46950: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, run_state=ITERATING_SETUP, fail_state=FAILED_NONE, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8119 1726773005.46952: done advancing hosts to next task
8119 1726773005.46986: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
8119 1726773005.46992: getting variables
8119 1726773005.46995: in VariableManager get_vars()
8119 1726773005.47011: Calling all_inventory to load vars for managed_node2
8119 1726773005.47016: Calling groups_inventory to load vars for managed_node2
8119 1726773005.47020: Calling all_plugins_inventory to load vars for managed_node2
8119 1726773005.47125: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py
8119 1726773005.47139: Calling all_plugins_play to load vars for managed_node2
8119 1726773005.47158: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8119 1726773005.47173: Calling groups_plugins_inventory to load vars for managed_node2
8119 1726773005.47194: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8119 1726773005.47205: Calling groups_plugins_play to load vars for managed_node2
8119 1726773005.47222: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8119 1726773005.47251: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8119 1726773005.47275: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8119 1726773005.47556: done with get_vars()
8119 1726773005.47567: done getting variables
8119 1726773005.47572: sending task start callback, copying the task so we can template it temporarily
8119 1726773005.47575: done copying, going to template now
8119 1726773005.47578: done templating
8119 1726773005.47580: here goes the callback...

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:2
Thursday 19 September 2024 15:10:05 -0400 (0:00:00.032) 0:00:00.032 ****
8119 1726773005.47599: sending task start callback
8119 1726773005.47602: entering _queue_task() for managed_node2/gather_facts
8119 1726773005.47605: Creating lock for gather_facts
8119 1726773005.47799: worker is 1 (out of 1 available)
8119 1726773005.47833: exiting _queue_task() for managed_node2/gather_facts
8119 1726773005.47901: done queuing things up, now waiting for results queue to drain
8119 1726773005.47906: waiting for pending results...
8129 1726773005.48096: running TaskExecutor() for managed_node2/TASK: Gathering Facts
8129 1726773005.48146: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000033
8129 1726773005.48196: calling self._execute()
8129 1726773005.48379: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection
8129 1726773005.48436: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8129 1726773005.48451: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell
8129 1726773005.48467: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8129 1726773005.48477: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8129 1726773005.48637: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
8129 1726773005.48651: starting attempt loop
8129 1726773005.48654: running the handler
8129 1726773005.48675: _low_level_execute_command(): starting
8129 1726773005.48682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8129 1726773005.51525: stderr chunk (state=2): >>>Warning: Permanently added '10.31.8.150' (ECDSA) to the list of known hosts. <<<
8129 1726773005.64793: stdout chunk (state=3): >>>/root <<<
8129 1726773005.64911: stderr chunk (state=3): >>><<<
8129 1726773005.64920: stdout chunk (state=3): >>><<<
8129 1726773005.64949: _low_level_execute_command() done: rc=0, stdout=/root , stderr=Warning: Permanently added '10.31.8.150' (ECDSA) to the list of known hosts.
8129 1726773005.64968: _low_level_execute_command(): starting 8129 1726773005.64975: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810 `" && echo ansible-tmp-1726773005.6495938-8129-38635041727810="` echo /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810 `" ) && sleep 0' 8129 1726773005.67949: stdout chunk (state=2): >>>ansible-tmp-1726773005.6495938-8129-38635041727810=/root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810 <<< 8129 1726773005.68337: stderr chunk (state=3): >>><<< 8129 1726773005.68346: stdout chunk (state=3): >>><<< 8129 1726773005.68371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773005.6495938-8129-38635041727810=/root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810 , stderr= 8129 1726773005.68478: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity 8129 1726773005.68499: trying /usr/local/lib/python3.9/site-packages/ansible/modules/inventory 8129 1726773005.68509: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging 8129 1726773005.68515: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring 8129 1726773005.68565: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools 8129 1726773005.68605: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network 8129 1726773005.68664: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification 8129 1726773005.68720: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging 8129 1726773005.68730: trying /usr/local/lib/python3.9/site-packages/ansible/modules/remote_management 8129 1726773005.68748: trying /usr/local/lib/python3.9/site-packages/ansible/modules/source_control 8129 1726773005.68790: trying /usr/local/lib/python3.9/site-packages/ansible/modules/storage 8129 1726773005.68805: trying /usr/local/lib/python3.9/site-packages/ansible/modules/system 8129 1726773005.68978: ANSIBALLZ: Using lock for setup 8129 1726773005.68985: ANSIBALLZ: Acquiring lock 8129 1726773005.68990: ANSIBALLZ: Lock acquired: 140408693992752 8129 1726773005.68994: ANSIBALLZ: Creating module 8129 1726773005.90578: ANSIBALLZ: Writing module into payload 8129 1726773005.90697: ANSIBALLZ: Writing module 8129 1726773005.90717: ANSIBALLZ: Renaming module 8129 1726773005.90721: ANSIBALLZ: Done creating module 8129 1726773005.90737: _low_level_execute_command(): starting 8129 1726773005.90744: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'python3.5'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'python2.6'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0' 8129 1726773005.93897: stdout chunk (state=2): >>>PLATFORM Linux FOUND /usr/bin/python3.6 /usr/libexec/platform-python <<< 8129 1726773005.93930: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 8129 1726773005.94148: stderr chunk (state=3): >>><<< 8129 1726773005.94155: stdout chunk (state=3): >>><<< 8129 1726773005.94187: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.6 /usr/libexec/platform-python /usr/bin/python3 ENDFOUND , stderr= 8129 1726773005.94199 [managed_node2]: found 
interpreters: ['/usr/bin/python3.6', '/usr/libexec/platform-python', '/usr/bin/python3'] 8129 1726773005.94260: _low_level_execute_command(): starting 8129 1726773005.94271: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.6 && sleep 0' 8129 1726773005.95708: Sending initial data 8129 1726773005.95722: Sent initial data (1234 bytes) 8129 1726773006.00336: stdout chunk (state=3): >>>{"platform_dist_result": ["centos", "8", ""], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 8129 1726773006.00736: stderr chunk (state=3): >>><<< 8129 1726773006.00741: stdout chunk (state=3): >>><<< 8129 1726773006.00764: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": ["centos", "8", ""], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr= 8129 1726773006.00830: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/setup-ZIP_DEFLATED 8129 1726773006.00902: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810/AnsiballZ_setup.py 8129 1726773006.01209: Sending initial data 8129 1726773006.01224: Sent initial data (151 bytes) 8129 1726773006.03835: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpp_xk46cf /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810/AnsiballZ_setup.py <<< 8129 1726773006.05642: stderr chunk (state=3): >>><<< 8129 1726773006.05650: stdout chunk (state=3): >>><<< 8129 1726773006.05682: done transferring module to remote 8129 1726773006.05706: _low_level_execute_command(): starting 8129 1726773006.05713: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810/ /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810/AnsiballZ_setup.py && sleep 0' 8129 1726773006.08452: stderr chunk (state=2): >>><<< 8129 1726773006.08465: stdout chunk (state=2): >>><<< 8129 1726773006.08486: _low_level_execute_command() done: rc=0, stdout=, stderr= 8129 1726773006.08491: _low_level_execute_command(): starting 8129 1726773006.08497: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810/AnsiballZ_setup.py && sleep 0' 8129 1726773006.87641: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": 
"xen", "ansible_virtualization_role": "guest", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "NA", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-8-150", "ansible_nodename": "ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "c3e1d9fa8b684f3abae9fbe37bdf8cdf", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "06", "epoch": "1726773006", "date": "2024-09-19", "time": "15:10:06", "iso8601_micro": "2024-09-19T19:10:06.626617Z", "iso8601": "2024-09-19T19:10:06Z", "iso8601_basic": "20240919T151006626617", "iso8601_basic_short": "20240919T151006", "tz": "<<< 8129 1726773006.88005: stdout chunk (state=3): >>>EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALFZBDdfqkngrfgrSSO/kOko3izToWrd6nOpiEjdLVtcXzovVx/6erjnSGabbSOWK8kW9F5xCwoGapRqymVUpYt2iP3WjMoraiZe76U/z6uZzecAsA1RHY0a1aNyRnOVuT0MwGPdOrPG0IUEHdcjNFPIyIR7a1a7jQuuK/VUomILAAAAFQCa+HW/TznpkY1Sjf2CFciRg0F35QAAAIBWyN49dx0xdzzoLQwc1selHyVtecBGccevTf1vXKodpJhjh6cK8qY+3hXglj10iG4E6TtsmyqPol1hCZfFivhH22g02zVJ+hqKtiliw1mg3bP/lOHHTfHADF8cFnZIBCbWAu6XFID+j0R5RJAaZYrnOht9+1c+fjumGg9DDWqgQQAAAIEAoc1c1DCr/HzzPvbX6dfCKdaFtfLJNHCDDyFpuCKPB2NxRjyz5zkgd9ECD5Db2fyiB7rrkVpKgl8MnJ3ERomNMEakr1OsEhnkLz6QnAKkJ27EIvUbiucfxnFY0Mdr+F0O32CpkSFIpOVhHqfa31c2jqBuLUn3A0kvZR5zIvnlviY=", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDX9Txnu9uEFeu0KZgsdvC4y52gHMP2bfEsIr97BRES1SRGlVvZsrGojsWPQfeFjIxtnUYjZv/DvIzys5NaA3SoIa95g/0tFTtlyzf98gQdtW1WMnnIsmnj6zaOoRYUhkEBR20EF1Yg32E7aTDyyGVTK/TVsoH3XeAqwbhznS1EGFxqyJhKP4NccmB0G2bwjxAMEGt/YPunlhmVPiOTFNqKeUc1BiUQQUbEfmkDZW7GTv3YDDw9KDrwHipZwfHyZgq/6A6QlKXlx1ddePP3sey+9i/3o9HUMyrEkfPLZYyiM2LbRMZ/1NOrsDuKxKTt+UeXP8HDWhxx9HNTSg9VR5lsgAH/t8QfxJYnkMpwkfOnqp9a/uVXqAccpQzKPjgbfKdmvbMeQEr4CFnAr8wNEPVdyBGYWC/tCRgAvnyMZ+QX/C/Yr1c0NVdBe24CDYz6txO0kKeJJiuGv0Lw3qlo5+r1MLdsxkN6IHzgJub9C3BG21hxi5jwQiv73IUmZJIf600=", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFX6Zs61bYvLLRIbM0riEKY9ACddaTbxcWPyntGg+v54psh7ooWEQSJH51NOypf9DqjWEdfAXxZnbti1GFYn1tA=", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAsN4dU//fXTrMOrPWiVNgqCSga1BeU/yxcnkfUXDgWW", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2926, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 613, "free": 2926}, "nocache": {"free": 3343, "used": 196}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_version": "4.11.amazon", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec29c17a-e58f-faa8-0635-3aef69167416", "ansible_product_uuid": "ec29c17a-e58f-faa8-0635-3aef69167416", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 405, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263646339072, "block_size": 4096, "block_total": 65533179, "block_available": 64366782, "block_used": 1166397, "inode_total": 131071472, "inode_available": 130996310, "inode_used": 75162, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.11.49 54512 10.31.8.150 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.11.49 54512 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde 
--show-dot $@\n}"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:aa:ca:51:b5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.8.150", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0"}, "ipv6": [{"address": "fe80::10d4:aaff:feca:51b5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": 
"off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.8.150", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "macaddress": "12:d4:aa:ca:51:b5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.8.150"], "ansible_all_ipv6_addresses": ["fe80::10d4:aaff:feca:51b5"], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": "*", "fact_path": "/etc/ansible/facts.d"}}} <<< 8129 1726773006.89194: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 8129 1726773006.89242: stderr chunk (state=3): >>><<< 8129 1726773006.89249: stdout chunk (state=3): >>><<< 8129 1726773006.89276: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "NA", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-8-150", "ansible_nodename": "ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "c3e1d9fa8b684f3abae9fbe37bdf8cdf", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "06", "epoch": "1726773006", "date": "2024-09-19", "time": "15:10:06", "iso8601_micro": "2024-09-19T19:10:06.626617Z", "iso8601": "2024-09-19T19:10:06Z", "iso8601_basic": "20240919T151006626617", "iso8601_basic_short": "20240919T151006", "tz": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALFZBDdfqkngrfgrSSO/kOko3izToWrd6nOpiEjdLVtcXzovVx/6erjnSGabbSOWK8kW9F5xCwoGapRqymVUpYt2iP3WjMoraiZe76U/z6uZzecAsA1RHY0a1aNyRnOVuT0MwGPdOrPG0IUEHdcjNFPIyIR7a1a7jQuuK/VUomILAAAAFQCa+HW/TznpkY1Sjf2CFciRg0F35QAAAIBWyN49dx0xdzzoLQwc1selHyVtecBGccevTf1vXKodpJhjh6cK8qY+3hXglj10iG4E6TtsmyqPol1hCZfFivhH22g02zVJ+hqKtiliw1mg3bP/lOHHTfHADF8cFnZIBCbWAu6XFID+j0R5RJAaZYrnOht9+1c+fjumGg9DDWqgQQAAAIEAoc1c1DCr/HzzPvbX6dfCKdaFtfLJNHCDDyFpuCKPB2NxRjyz5zkgd9ECD5Db2fyiB7rrkVpKgl8MnJ3ERomNMEakr1OsEhnkLz6QnAKkJ27EIvUbiucfxnFY0Mdr+F0O32CpkSFIpOVhHqfa31c2jqBuLUn3A0kvZR5zIvnlviY=", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDX9Txnu9uEFeu0KZgsdvC4y52gHMP2bfEsIr97BRES1SRGlVvZsrGojsWPQfeFjIxtnUYjZv/DvIzys5NaA3SoIa95g/0tFTtlyzf98gQdtW1WMnnIsmnj6zaOoRYUhkEBR20EF1Yg32E7aTDyyGVTK/TVsoH3XeAqwbhznS1EGFxqyJhKP4NccmB0G2bwjxAMEGt/YPunlhmVPiOTFNqKeUc1BiUQQUbEfmkDZW7GTv3YDDw9KDrwHipZwfHyZgq/6A6QlKXlx1ddePP3sey+9i/3o9HUMyrEkfPLZYyiM2LbRMZ/1NOrsDuKxKTt+UeXP8HDWhxx9HNTSg9VR5lsgAH/t8QfxJYnkMpwkfOnqp9a/uVXqAccpQzKPjgbfKdmvbMeQEr4CFnAr8wNEPVdyBGYWC/tCRgAvnyMZ+QX/C/Yr1c0NVdBe24CDYz6txO0kKeJJiuGv0Lw3qlo5+r1MLdsxkN6IHzgJub9C3BG21hxi5jwQiv73IUmZJIf600=", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFX6Zs61bYvLLRIbM0riEKY9ACddaTbxcWPyntGg+v54psh7ooWEQSJH51NOypf9DqjWEdfAXxZnbti1GFYn1tA=", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAsN4dU//fXTrMOrPWiVNgqCSga1BeU/yxcnkfUXDgWW", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2926, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 613, "free": 2926}, "nocache": {"free": 3343, "used": 196}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_version": "4.11.amazon", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec29c17a-e58f-faa8-0635-3aef69167416", "ansible_product_uuid": "ec29c17a-e58f-faa8-0635-3aef69167416", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 405, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263646339072, "block_size": 4096, "block_total": 65533179, "block_available": 64366782, "block_used": 1166397, "inode_total": 131071472, "inode_available": 130996310, "inode_used": 75162, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.11.49 54512 10.31.8.150 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", 
"SSH_CLIENT": "10.31.11.49 54512 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:aa:ca:51:b5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.8.150", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0"}, "ipv6": [{"address": "fe80::10d4:aaff:feca:51b5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.8.150", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "macaddress": "12:d4:aa:ca:51:b5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.8.150"], "ansible_all_ipv6_addresses": ["fe80::10d4:aaff:feca:51b5"], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": "*", "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.150 closed. 
8129 1726773006.89594: done with _execute_module (setup, {'gather_subset': ['all'], 'gather_timeout': 10, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8129 1726773006.89612: _low_level_execute_command(): starting 8129 1726773006.89620: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773005.6495938-8129-38635041727810/ > /dev/null 2>&1 && sleep 0' 8129 1726773006.92598: stderr chunk (state=2): >>><<< 8129 1726773006.92618: stdout chunk (state=2): >>><<< 8129 1726773006.92647: _low_level_execute_command() done: rc=0, stdout=, stderr= 8129 1726773006.92661: handler run complete 8129 1726773006.93267: attempt loop complete, returning result 8129 1726773006.93282: _execute() done 8129 1726773006.93288: dumping result to json 8129 1726773006.93319: done dumping result, returning 8129 1726773006.93332: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12a3200b-1e9d-1dbd-cc52-000000000033] 8129 1726773006.93347: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000033 8129 1726773006.93699: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000033 8129 1726773006.93705: WORKER PROCESS EXITING ok: [managed_node2] 8119 1726773006.94481: no more pending results, returning what we have 8119 1726773006.94492: results queue empty 8119 1726773006.94495: checking for any_errors_fatal 8119 1726773006.94498: done checking for any_errors_fatal 8119 1726773006.94500: checking for max_fail_percentage 8119 1726773006.94503: done checking for max_fail_percentage 8119 1726773006.94504: checking to see if all hosts have failed and the running result is not ok 8119 1726773006.94509: done checking to see if all hosts have failed 8119 1726773006.94511: getting the remaining hosts for this loop 8119 1726773006.94513: done getting the remaining hosts for this loop 8119 1726773006.94522: building list of next tasks for hosts 8119 1726773006.94524: getting the next task for host managed_node2 8119 1726773006.94530: done getting next task for host managed_node2 8119 1726773006.94534: ^ task is: TASK: meta (flush_handlers) 8119 1726773006.94538: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773006.94540: done building task lists 8119 1726773006.94542: counting tasks in each state of execution 8119 1726773006.94546: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773006.94548: advancing hosts in ITERATING_TASKS 8119 1726773006.94550: starting to advance hosts 8119 1726773006.94552: getting the next task for host managed_node2 8119 1726773006.94556: done getting next task for host managed_node2 8119 1726773006.94558: ^ task is: TASK: meta (flush_handlers) 8119 1726773006.94561: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773006.94563: done advancing hosts to next task META: ran handlers 8119 1726773006.94590: done queuing things up, now waiting for results queue to drain 8119 1726773006.94593: results queue empty 8119 1726773006.94595: checking for any_errors_fatal 8119 1726773006.94599: done checking for any_errors_fatal 8119 1726773006.94600: checking for max_fail_percentage 8119 1726773006.94602: done checking for max_fail_percentage 8119 1726773006.94604: checking to see if all hosts have failed and the running result is not ok 8119 1726773006.94606: done checking to see if all hosts have failed 8119 1726773006.94610: getting the remaining hosts for this loop 8119 1726773006.94613: done getting the remaining hosts for this loop 8119 1726773006.94619: building list of next tasks for hosts 8119 1726773006.94621: getting the next task for host managed_node2 8119 1726773006.94627: done getting next task for host managed_node2 8119 1726773006.94629: ^ task is: TASK: Check if system is ostree 8119 1726773006.94632: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773006.94634: done building task lists 8119 1726773006.94636: counting tasks in each state of execution 8119 1726773006.94639: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773006.94641: advancing hosts in ITERATING_TASKS 8119 1726773006.94643: starting to advance hosts 8119 1726773006.94645: getting the next task for host managed_node2 8119 1726773006.94649: done getting next task for host managed_node2 8119 1726773006.94651: ^ task is: TASK: Check if system is ostree 8119 1726773006.94654: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773006.94656: done advancing hosts to next task 8119 1726773006.94664: getting variables 8119 1726773006.94667: in VariableManager get_vars() 8119 1726773006.94696: Calling all_inventory to load vars for managed_node2 8119 1726773006.94703: Calling groups_inventory to load vars for managed_node2 8119 1726773006.94709: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773006.94739: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773006.94754: Calling all_plugins_play to load vars for managed_node2 8119 1726773006.94771: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773006.94787: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773006.94804: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773006.94816: Calling groups_plugins_play to load vars for managed_node2 8119 1726773006.94832: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773006.94861: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773006.94885: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773006.95268: done with get_vars() 8119 1726773006.95281: done getting variables 8119 1726773006.95303: sending task start callback, copying the task so we can template it temporarily 8119 1726773006.95309: done copying, going to template now 8119 1726773006.95312: done templating 8119 1726773006.95314: here goes the callback... TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:12 Thursday 19 September 2024 15:10:06 -0400 (0:00:01.477) 0:00:01.509 **** 8119 1726773006.95335: sending task start callback 8119 1726773006.95338: entering _queue_task() for managed_node2/stat 8119 1726773006.95484: worker is 1 (out of 1 available) 8119 1726773006.95523: exiting _queue_task() for managed_node2/stat 8119 1726773006.95590: done queuing things up, now waiting for results queue to drain 8119 1726773006.95595: waiting for pending results... 
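The task being queued here is the ostree check at tests_change_settings.yml:12. Based on the stat module arguments and the set_fact result that appear later in this log, an approximate sketch of what those two tasks likely contain (the register variable name is a hypothetical placeholder, not taken from the test source):

    # approximate reconstruction from the logged module arguments and results
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat   # hypothetical name

    - name: Set flag to indicate system is ostree
      set_fact:
        __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
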
8188 1726773006.96554: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 8188 1726773006.96620: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000000b 8188 1726773006.96673: calling self._execute() 8188 1726773006.99405: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8188 1726773006.99522: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8188 1726773006.99596: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8188 1726773006.99634: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8188 1726773006.99677: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8188 1726773006.99720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8188 1726773006.99778: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8188 1726773006.99811: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8188 1726773006.99835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8188 1726773006.99951: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8188 1726773006.99975: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8188 1726773007.00000: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8188 1726773007.00440: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8188 1726773007.00499: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8188 1726773007.00516: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8188 1726773007.00535: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8188 1726773007.00542: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8188 1726773007.00654: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8188 1726773007.00674: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8188 1726773007.00705: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8188 1726773007.00722: starting attempt loop 8188 1726773007.00725: running the handler 8188 1726773007.00734: _low_level_execute_command(): starting 8188 1726773007.00739: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8188 1726773007.04567: stdout chunk (state=2): >>>/root <<< 8188 1726773007.04586: stderr chunk (state=2): >>><<< 8188 1726773007.04601: stdout chunk (state=3): >>><<< 8188 1726773007.04625: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8188 1726773007.04646: 
_low_level_execute_command(): starting 8188 1726773007.04655: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598 `" && echo ansible-tmp-1726773007.0463722-8188-267752910466598="` echo /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598 `" ) && sleep 0' 8188 1726773007.07908: stdout chunk (state=2): >>>ansible-tmp-1726773007.0463722-8188-267752910466598=/root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598 <<< 8188 1726773007.07990: stderr chunk (state=3): >>><<< 8188 1726773007.07998: stdout chunk (state=3): >>><<< 8188 1726773007.08027: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773007.0463722-8188-267752910466598=/root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598 , stderr= 8188 1726773007.08150: ANSIBALLZ: Using lock for stat 8188 1726773007.08155: ANSIBALLZ: Acquiring lock 8188 1726773007.08160: ANSIBALLZ: Lock acquired: 140408694761600 8188 1726773007.08163: ANSIBALLZ: Creating module 8188 1726773007.22468: ANSIBALLZ: Writing module into payload 8188 1726773007.22597: ANSIBALLZ: Writing module 8188 1726773007.22622: ANSIBALLZ: Renaming module 8188 1726773007.22628: ANSIBALLZ: Done creating module 8188 1726773007.22658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598/AnsiballZ_stat.py 8188 1726773007.23465: Sending initial data 8188 1726773007.23479: Sent initial data (151 bytes) 8188 1726773007.26460: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpbifd7bs0 /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598/AnsiballZ_stat.py <<< 8188 1726773007.28292: stderr chunk (state=3): >>><<< 8188 1726773007.28300: stdout chunk (state=3): >>><<< 8188 1726773007.28332: done transferring module to remote 8188 1726773007.28354: _low_level_execute_command(): starting 8188 1726773007.28363: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598/ /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598/AnsiballZ_stat.py && sleep 0' 8188 1726773007.31468: stderr chunk (state=2): >>><<< 8188 1726773007.31485: stdout chunk (state=2): >>><<< 8188 1726773007.31509: _low_level_execute_command() done: rc=0, stdout=, stderr= 8188 1726773007.31516: _low_level_execute_command(): starting 8188 1726773007.31525: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598/AnsiballZ_stat.py && sleep 0' 8188 1726773007.46134: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8188 1726773007.47132: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 8188 1726773007.47142: stdout chunk (state=3): >>><<< 8188 1726773007.47155: stderr chunk (state=3): >>><<< 8188 1726773007.47176: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 8188 1726773007.47211: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8188 1726773007.47229: _low_level_execute_command(): starting 8188 1726773007.47236: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773007.0463722-8188-267752910466598/ > /dev/null 2>&1 && sleep 0' 8188 1726773007.50171: stderr chunk (state=2): >>><<< 8188 1726773007.50187: stdout chunk (state=2): >>><<< 8188 1726773007.50212: _low_level_execute_command() done: rc=0, stdout=, stderr= 8188 1726773007.50222: handler run complete 8188 1726773007.50256: attempt loop complete, returning result 8188 1726773007.50269: _execute() done 8188 1726773007.50273: dumping result to json 8188 1726773007.50277: done dumping result, returning 8188 1726773007.50294: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [12a3200b-1e9d-1dbd-cc52-00000000000b] 8188 1726773007.50312: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000b 8188 1726773007.50367: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000b 8188 1726773007.50372: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 8119 1726773007.50830: no more pending results, returning what we have 8119 1726773007.50836: results queue empty 8119 1726773007.50838: checking for any_errors_fatal 8119 1726773007.50842: done checking for any_errors_fatal 8119 1726773007.50844: checking for max_fail_percentage 8119 1726773007.50847: done checking for max_fail_percentage 8119 1726773007.50849: checking to see if all hosts have failed and the running result is not ok 8119 1726773007.50852: done checking to see if all hosts have failed 8119 1726773007.50854: getting the remaining hosts for this loop 8119 1726773007.50857: done getting the remaining hosts for this loop 8119 1726773007.50865: building list of next tasks for hosts 8119 1726773007.50868: getting the next task for host managed_node2 8119 1726773007.50875: done getting next task for host managed_node2 8119 1726773007.50879: ^ task is: TASK: Set flag to indicate system is ostree 8119 1726773007.50885: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773007.50888: done building task lists 8119 1726773007.50891: counting tasks in each state of execution 8119 1726773007.50895: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773007.50897: advancing hosts in ITERATING_TASKS 8119 1726773007.50900: starting to advance hosts 8119 1726773007.50902: getting the next task for host managed_node2 8119 1726773007.50907: done getting next task for host managed_node2 8119 1726773007.50910: ^ task is: TASK: Set flag to indicate system is ostree 8119 1726773007.50913: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773007.50915: done advancing hosts to next task 8119 1726773007.50974: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773007.50980: getting variables 8119 1726773007.50985: in VariableManager get_vars() 8119 1726773007.51014: Calling all_inventory to load vars for managed_node2 8119 1726773007.51022: Calling groups_inventory to load vars for managed_node2 8119 1726773007.51026: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773007.51057: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.51097: Calling all_plugins_play to load vars for managed_node2 8119 1726773007.51118: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.51134: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773007.51153: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.51164: Calling groups_plugins_play to load vars for managed_node2 8119 1726773007.51182: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.51217: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.51242: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.51571: done with get_vars() 8119 1726773007.51585: done getting variables 8119 1726773007.51592: sending task start callback, copying the task so we can template it temporarily 8119 1726773007.51595: 
done copying, going to template now 8119 1726773007.51598: done templating 8119 1726773007.51600: here goes the callback... TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:17 Thursday 19 September 2024 15:10:07 -0400 (0:00:00.562) 0:00:02.072 **** 8119 1726773007.51622: sending task start callback 8119 1726773007.51625: entering _queue_task() for managed_node2/set_fact 8119 1726773007.51629: Creating lock for set_fact 8119 1726773007.51804: worker is 1 (out of 1 available) 8119 1726773007.51839: exiting _queue_task() for managed_node2/set_fact 8119 1726773007.51907: done queuing things up, now waiting for results queue to drain 8119 1726773007.51912: waiting for pending results... 8218 1726773007.52094: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 8218 1726773007.52153: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000000c 8218 1726773007.52205: calling self._execute() 8218 1726773007.56771: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8218 1726773007.56882: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8218 1726773007.56963: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8218 1726773007.57005: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8218 1726773007.57050: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8218 1726773007.57086: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8218 1726773007.57143: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8218 1726773007.57171: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8218 1726773007.57197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8218 1726773007.57303: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8218 1726773007.57328: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8218 1726773007.57349: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8218 1726773007.57774: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8218 1726773007.57785: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8218 1726773007.57789: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8218 1726773007.57793: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8218 1726773007.57796: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8218 1726773007.57799: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8218 1726773007.57803: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8218 1726773007.57806: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8218 1726773007.57812: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8218 1726773007.57835: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8218 1726773007.57839: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8218 1726773007.57842: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8218 1726773007.57881: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8218 1726773007.57935: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8218 1726773007.57946: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8218 1726773007.57961: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8218 1726773007.57967: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8218 1726773007.58073: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8218 1726773007.58081: starting attempt loop 8218 1726773007.58098: running the handler 8218 1726773007.58111: handler run complete 8218 1726773007.58116: attempt loop complete, returning result 8218 1726773007.58119: _execute() done 8218 1726773007.58122: dumping result to json 8218 1726773007.58124: done dumping result, returning 8218 1726773007.58129: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [12a3200b-1e9d-1dbd-cc52-00000000000c] 8218 1726773007.58136: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000c 8218 1726773007.58157: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000c 8218 1726773007.58160: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_is_ostree": false }, "changed": false } 8119 1726773007.58379: no more pending results, returning what we have 8119 1726773007.58386: results queue empty 8119 1726773007.58389: checking for any_errors_fatal 8119 1726773007.58394: done checking for any_errors_fatal 8119 1726773007.58396: checking for max_fail_percentage 8119 1726773007.58399: done checking for max_fail_percentage 8119 1726773007.58401: checking to see if all hosts have failed and the running result is not ok 8119 1726773007.58403: done checking to see if all hosts have failed 8119 1726773007.58405: getting the remaining hosts for this loop 8119 
1726773007.58408: done getting the remaining hosts for this loop 8119 1726773007.58416: building list of next tasks for hosts 8119 1726773007.58419: getting the next task for host managed_node2 8119 1726773007.58427: done getting next task for host managed_node2 8119 1726773007.58430: ^ task is: TASK: Ensure required packages are installed 8119 1726773007.58433: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773007.58435: done building task lists 8119 1726773007.58437: counting tasks in each state of execution 8119 1726773007.58441: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773007.58443: advancing hosts in ITERATING_TASKS 8119 1726773007.58445: starting to advance hosts 8119 1726773007.58447: getting the next task for host managed_node2 8119 1726773007.58452: done getting next task for host managed_node2 8119 1726773007.58454: ^ task is: TASK: Ensure required packages are installed 8119 1726773007.58456: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773007.58458: done advancing hosts to next task 8119 1726773007.58517: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773007.58523: getting variables 8119 1726773007.58527: in VariableManager get_vars() 8119 1726773007.58555: Calling all_inventory to load vars for managed_node2 8119 1726773007.58561: Calling groups_inventory to load vars for managed_node2 8119 1726773007.58565: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773007.58595: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.58611: Calling all_plugins_play to load vars for managed_node2 8119 1726773007.58628: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.58641: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773007.58657: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.58667: Calling groups_plugins_play to load vars for managed_node2 8119 1726773007.58682: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.58712: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773007.58736: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 
1726773007.59097: done with get_vars() 8119 1726773007.59111: done getting variables 8119 1726773007.59118: sending task start callback, copying the task so we can template it temporarily 8119 1726773007.59121: done copying, going to template now 8119 1726773007.59124: done templating 8119 1726773007.59127: here goes the callback... TASK [Ensure required packages are installed] ********************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:22 Thursday 19 September 2024 15:10:07 -0400 (0:00:00.075) 0:00:02.147 **** 8119 1726773007.59146: sending task start callback 8119 1726773007.59149: entering _queue_task() for managed_node2/package 8119 1726773007.59152: Creating lock for package 8119 1726773007.59310: worker is 1 (out of 1 available) 8119 1726773007.59344: exiting _queue_task() for managed_node2/package 8119 1726773007.59426: done queuing things up, now waiting for results queue to drain 8119 1726773007.59431: waiting for pending results... 8223 1726773007.59610: running TaskExecutor() for managed_node2/TASK: Ensure required packages are installed 8223 1726773007.59662: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000000d 8223 1726773007.59718: calling self._execute() 8223 1726773007.64311: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8223 1726773007.64420: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8223 1726773007.64498: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8223 1726773007.64538: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8223 1726773007.64578: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8223 1726773007.64615: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8223 1726773007.64667: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8223 1726773007.64698: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8223 1726773007.64723: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8223 1726773007.64830: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8223 1726773007.64853: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8223 1726773007.64874: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8223 1726773007.65120: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8223 1726773007.65167: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8223 1726773007.65179: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8223 1726773007.65196: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8223 1726773007.65204: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8223 
1726773007.65307: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8223 1726773007.65316: starting attempt loop 8223 1726773007.65319: running the handler 8223 1726773007.65491: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity 8223 1726773007.65508: trying /usr/local/lib/python3.9/site-packages/ansible/modules/inventory 8223 1726773007.65519: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging 8223 1726773007.65526: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring 8223 1726773007.65578: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools 8223 1726773007.65613: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network 8223 1726773007.65665: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification 8223 1726773007.65718: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging 8223 1726773007.65727: trying /usr/local/lib/python3.9/site-packages/ansible/modules/remote_management 8223 1726773007.65745: trying /usr/local/lib/python3.9/site-packages/ansible/modules/source_control 8223 1726773007.65785: trying /usr/local/lib/python3.9/site-packages/ansible/modules/storage 8223 1726773007.65798: trying /usr/local/lib/python3.9/site-packages/ansible/modules/system 8223 1726773007.65914: trying /usr/local/lib/python3.9/site-packages/ansible/modules/utilities 8223 1726773007.65924: trying /usr/local/lib/python3.9/site-packages/ansible/modules/web_infrastructure 8223 1726773007.65962: trying /usr/local/lib/python3.9/site-packages/ansible/modules/__pycache__ 8223 1726773007.65968: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/alicloud 8223 1726773007.65979: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/amazon 8223 1726773007.66371: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/atomic 8223 1726773007.66386: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/azure 8223 1726773007.66766: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/centurylink 8223 1726773007.66792: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudscale 8223 1726773007.66805: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudstack 8223 1726773007.66894: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/digital_ocean 8223 1726773007.66952: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/dimensiondata 8223 1726773007.66963: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/docker 8223 1726773007.67006: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/google 8223 1726773007.67324: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/hcloud 8223 1726773007.67352: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/heroku 8223 1726773007.67357: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/huawei 8223 1726773007.67362: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/kubevirt 8223 1726773007.67371: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/linode 8223 1726773007.67376: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxc 8223 
1726773007.67381: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxd 8223 1726773007.67390: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/memset 8223 1726773007.67409: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/misc 8223 1726773007.67436: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oneandone 8223 1726773007.67452: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/online 8223 1726773007.67464: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/opennebula 8223 1726773007.67479: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/openstack 8223 1726773007.67586: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oracle 8223 1726773007.67596: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovh 8223 1726773007.67606: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovirt 8223 1726773007.67729: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/packet 8223 1726773007.67741: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/podman 8223 1726773007.67751: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/profitbricks 8223 1726773007.67765: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/pubnub 8223 1726773007.67773: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/rackspace 8223 1726773007.67821: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/scaleway 8223 1726773007.67863: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/smartos 8223 1726773007.67878: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/softlayer 8223 1726773007.67914: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/spotinst 8223 1726773007.67925: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/univention 8223 1726773007.67939: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vmware 8223 1726773007.68174: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vultr 8223 1726773007.68268: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/webfaction 8223 1726773007.68284: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/xenserver 8223 1726773007.68296: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/__pycache__ 8223 1726773007.68303: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/alicloud/__pycache__ 8223 1726773007.68311: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/amazon/__pycache__ 8223 1726773007.68538: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/atomic/__pycache__ 8223 1726773007.68548: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/azure/__pycache__ 8223 1726773007.68774: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/centurylink/__pycache__ 8223 1726773007.68792: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudscale/__pycache__ 8223 1726773007.68801: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudstack/__pycache__ 8223 1726773007.68855: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/digital_ocean/__pycache__ 8223 1726773007.68891: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/dimensiondata/__pycache__ 8223 1726773007.68898: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/cloud/docker/__pycache__ 8223 1726773007.68924: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/google/__pycache__ 8223 1726773007.69110: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/hcloud/__pycache__ 8223 1726773007.69139: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/heroku/__pycache__ 8223 1726773007.69145: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/huawei/__pycache__ 8223 1726773007.69151: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/kubevirt/__pycache__ 8223 1726773007.69161: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/linode/__pycache__ 8223 1726773007.69168: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxc/__pycache__ 8223 1726773007.69174: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxd/__pycache__ 8223 1726773007.69181: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/memset/__pycache__ 8223 1726773007.69196: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/misc/__pycache__ 8223 1726773007.69214: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oneandone/__pycache__ 8223 1726773007.69226: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/online/__pycache__ 8223 1726773007.69237: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/opennebula/__pycache__ 8223 1726773007.69248: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/openstack/__pycache__ 8223 1726773007.69336: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oracle/__pycache__ 8223 1726773007.69345: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovh/__pycache__ 8223 1726773007.69354: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovirt/__pycache__ 8223 1726773007.69435: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/packet/__pycache__ 8223 1726773007.69446: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/podman/__pycache__ 8223 1726773007.69454: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/profitbricks/__pycache__ 8223 1726773007.69465: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/pubnub/__pycache__ 8223 1726773007.69472: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/rackspace/__pycache__ 8223 1726773007.69504: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/scaleway/__pycache__ 8223 1726773007.69534: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/smartos/__pycache__ 8223 1726773007.69547: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/softlayer/__pycache__ 8223 1726773007.69554: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/spotinst/__pycache__ 8223 1726773007.69561: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/univention/__pycache__ 8223 1726773007.69572: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vmware/__pycache__ 8223 1726773007.69721: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vultr/__pycache__ 8223 1726773007.69769: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/webfaction/__pycache__ 8223 1726773007.69782: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/xenserver/__pycache__ 8223 1726773007.69795: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/clustering/k8s 8223 1726773007.69814: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/openshift 8223 1726773007.69826: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/__pycache__ 8223 1726773007.69839: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/k8s/__pycache__ 8223 1726773007.69852: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/openshift/__pycache__ 8223 1726773007.69862: trying /usr/local/lib/python3.9/site-packages/ansible/modules/commands/__pycache__ 8223 1726773007.69874: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/acme 8223 1726773007.69895: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/entrust 8223 1726773007.69904: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/__pycache__ 8223 1726773007.69925: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/acme/__pycache__ 8223 1726773007.69939: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/entrust/__pycache__ 8223 1726773007.69946: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/aerospike 8223 1726773007.69954: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/influxdb 8223 1726773007.69968: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/misc 8223 1726773007.69980: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mongodb 8223 1726773007.69995: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mssql 8223 1726773007.70003: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mysql 8223 1726773007.70017: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/postgresql 8223 1726773007.70053: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/proxysql 8223 1726773007.70070: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/vertica 8223 1726773007.70087: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/__pycache__ 8223 1726773007.70094: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/aerospike/__pycache__ 8223 1726773007.70100: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/influxdb/__pycache__ 8223 1726773007.70111: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/misc/__pycache__ 8223 1726773007.70120: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mongodb/__pycache__ 8223 1726773007.70130: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mssql/__pycache__ 8223 1726773007.70136: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mysql/__pycache__ 8223 1726773007.70147: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/postgresql/__pycache__ 8223 1726773007.70170: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/proxysql/__pycache__ 8223 1726773007.70186: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/vertica/__pycache__ 8223 1726773007.70199: trying /usr/local/lib/python3.9/site-packages/ansible/modules/files/__pycache__ 8223 1726773007.70224: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/cyberark 8223 1726773007.70235: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/ipa 8223 1726773007.70265: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/identity/keycloak 8223 1726773007.70276: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/opendj 8223 1726773007.70287: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/__pycache__ 8223 1726773007.70295: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/cyberark/__pycache__ 8223 1726773007.70303: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/ipa/__pycache__ 8223 1726773007.70323: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/keycloak/__pycache__ 8223 1726773007.70333: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/opendj/__pycache__ 8223 1726773007.70340: trying /usr/local/lib/python3.9/site-packages/ansible/modules/inventory/__pycache__ 8223 1726773007.70347: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging/rabbitmq 8223 1726773007.70368: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging/__pycache__ 8223 1726773007.70375: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging/rabbitmq/__pycache__ 8223 1726773007.70439: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring/zabbix 8223 1726773007.70470: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring/__pycache__ 8223 1726773007.70509: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring/zabbix/__pycache__ 8223 1726773007.70531: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics 8223 1726773007.70545: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale 8223 1726773007.70556: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity 8223 1726773007.70564: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap 8223 1726773007.70575: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox 8223 1726773007.70591: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios 8223 1726773007.70621: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/__pycache__ 8223 1726773007.70642: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics/__pycache__ 8223 1726773007.70652: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale/__pycache__ 8223 1726773007.70660: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity/__pycache__ 8223 1726773007.70666: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap/__pycache__ 8223 1726773007.70675: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox/__pycache__ 8223 1726773007.70687: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios/__pycache__ 8223 1726773007.70708: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10 8223 1726773007.70722: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci 8223 1726773007.70887: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos 8223 1726773007.70899: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos 8223 1726773007.70922: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba 8223 1726773007.70933: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa 8223 1726773007.70945: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/avi 8223 1726773007.71051: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch 8223 1726773007.71064: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point 8223 1726773007.71219: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix 8223 1726773007.71229: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli 8223 1726773007.71239: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine 8223 1726773007.71342: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision 8223 1726773007.71352: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos 8223 1726773007.71401: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus 8223 1726773007.71422: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10 8223 1726773007.71433: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6 8223 1726773007.71444: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9 8223 1726773007.71455: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos 8223 1726773007.71467: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch 8223 1726773007.71476: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos 8223 1726773007.71514: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos 8223 1726773007.71562: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli 8223 1726773007.71572: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos 8223 1726773007.71586: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5 8223 1726773007.71852: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files 8223 1726773007.71863: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer 8223 1726773007.71871: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager 8223 1726773007.71919: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios 8223 1726773007.72446: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr 8223 1726773007.72458: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd 8223 1726773007.72470: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx 8223 1726773007.72501: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos 8223 1726773007.72524: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate 8223 1726773007.72534: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface 8223 1726773007.72545: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios 8223 1726773007.72593: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr 8223 1726773007.72626: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware 8223 1726773007.72638: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential 8223 1726773007.72648: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/junos 8223 1726773007.72700: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2 8223 1726773007.72711: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3 8223 1726773007.72721: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki 8223 1726773007.72756: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact 8223 1726773007.72764: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf 8223 1726773007.72776: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler 8223 1726773007.72804: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor 8223 1726773007.72889: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos 8223 1726773007.72902: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso 8223 1726773007.72916: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage 8223 1726773007.72924: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos 8223 1726773007.73051: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx 8223 1726773007.73098: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx 8223 1726773007.73108: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance 8223 1726773007.73118: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs 8223 1726773007.73128: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos 8223 1726773007.73165: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol 8223 1726773007.73172: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware 8223 1726773007.73186: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf 8223 1726773007.73196: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros 8223 1726773007.73205: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing 8223 1726773007.73213: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive 8223 1726773007.73224: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos 8223 1726773007.73245: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros 8223 1726773007.73257: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system 8223 1726773007.73271: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss 8223 1726773007.73284: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos 8223 1726773007.73322: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/__pycache__ 8223 1726773007.73329: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10/__pycache__ 8223 1726773007.73338: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci/__pycache__ 8223 1726773007.73449: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos/__pycache__ 8223 1726773007.73459: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos/__pycache__ 8223 1726773007.73477: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba/__pycache__ 8223 1726773007.73487: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa/__pycache__ 8223 1726773007.73497: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi/__pycache__ 8223 1726773007.73566: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch/__pycache__ 8223 
1726773007.73576: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point/__pycache__ 8223 1726773007.73678: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix/__pycache__ 8223 1726773007.73690: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli/__pycache__ 8223 1726773007.73698: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine/__pycache__ 8223 1726773007.73767: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision/__pycache__ 8223 1726773007.73776: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos/__pycache__ 8223 1726773007.73809: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus/__pycache__ 8223 1726773007.73824: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10/__pycache__ 8223 1726773007.73833: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6/__pycache__ 8223 1726773007.73842: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9/__pycache__ 8223 1726773007.73850: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos/__pycache__ 8223 1726773007.73859: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch/__pycache__ 8223 1726773007.73866: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos/__pycache__ 8223 1726773007.73875: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos/__pycache__ 8223 1726773007.73907: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli/__pycache__ 8223 1726773007.73916: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos/__pycache__ 8223 1726773007.73926: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5/__pycache__ 8223 1726773007.74087: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files/__pycache__ 8223 1726773007.74097: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer/__pycache__ 8223 1726773007.74104: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager/__pycache__ 8223 1726773007.74135: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios/__pycache__ 8223 1726773007.74546: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr/__pycache__ 8223 1726773007.74556: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd/__pycache__ 8223 1726773007.74566: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx/__pycache__ 8223 1726773007.74587: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos/__pycache__ 8223 1726773007.74604: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate/__pycache__ 8223 1726773007.74612: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface/__pycache__ 8223 1726773007.74621: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios/__pycache__ 8223 1726773007.74652: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr/__pycache__ 8223 1726773007.74674: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware/__pycache__ 8223 1726773007.74686: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential/__pycache__ 8223 1726773007.74694: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/junos/__pycache__ 8223 1726773007.74729: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2/__pycache__ 8223 1726773007.74737: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3/__pycache__ 8223 1726773007.74744: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki/__pycache__ 8223 1726773007.74768: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact/__pycache__ 8223 1726773007.74775: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf/__pycache__ 8223 1726773007.74785: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler/__pycache__ 8223 1726773007.74805: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor/__pycache__ 8223 1726773007.74857: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos/__pycache__ 8223 1726773007.74868: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso/__pycache__ 8223 1726773007.74879: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage/__pycache__ 8223 1726773007.74889: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos/__pycache__ 8223 1726773007.74970: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx/__pycache__ 8223 1726773007.75004: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx/__pycache__ 8223 1726773007.75013: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance/__pycache__ 8223 1726773007.75020: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs/__pycache__ 8223 1726773007.75028: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos/__pycache__ 8223 1726773007.75056: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol/__pycache__ 8223 1726773007.75064: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware/__pycache__ 8223 1726773007.75072: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf/__pycache__ 8223 1726773007.75080: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros/__pycache__ 8223 1726773007.75089: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing/__pycache__ 8223 1726773007.75097: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive/__pycache__ 8223 1726773007.75105: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos/__pycache__ 8223 1726773007.75120: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros/__pycache__ 8223 1726773007.75129: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system/__pycache__ 8223 1726773007.75140: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss/__pycache__ 8223 1726773007.75148: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos/__pycache__ 8223 1726773007.75173: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification/__pycache__ 8223 1726773007.75208: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/language 8223 1726773007.75235: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/os 8223 1726773007.75319: _low_level_execute_command(): starting 8223 1726773007.75326: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ 
&& sleep 0' 8223 1726773007.78693: stdout chunk (state=2): >>>/root <<< 8223 1726773007.78705: stderr chunk (state=2): >>><<< 8223 1726773007.78716: stdout chunk (state=3): >>><<< 8223 1726773007.78733: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8223 1726773007.78752: _low_level_execute_command(): starting 8223 1726773007.78759: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592 `" && echo ansible-tmp-1726773007.78744-8223-68537326430592="` echo /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592 `" ) && sleep 0' 8223 1726773007.81670: stdout chunk (state=2): >>>ansible-tmp-1726773007.78744-8223-68537326430592=/root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592 <<< 8223 1726773007.81795: stderr chunk (state=3): >>><<< 8223 1726773007.81802: stdout chunk (state=3): >>><<< 8223 1726773007.81835: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773007.78744-8223-68537326430592=/root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592 , stderr= 8223 1726773007.81973: ANSIBALLZ: Using lock for dnf 8223 1726773007.81978: ANSIBALLZ: Acquiring lock 8223 1726773007.81982: ANSIBALLZ: Lock acquired: 140408693992464 8223 1726773007.81987: ANSIBALLZ: Creating module 8223 1726773007.91756: ANSIBALLZ: Writing module into payload 8223 1726773007.91966: ANSIBALLZ: Writing module 8223 1726773007.91986: ANSIBALLZ: Renaming module 8223 1726773007.91990: ANSIBALLZ: Done creating module 8223 1726773007.92023: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592/AnsiballZ_dnf.py 8223 1726773007.92363: Sending initial data 8223 1726773007.92378: Sent initial data (147 bytes) 8223 1726773007.94959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpf3r7miyx /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592/AnsiballZ_dnf.py <<< 8223 1726773007.96166: stderr chunk (state=3): >>><<< 8223 1726773007.96172: stdout chunk (state=3): >>><<< 8223 1726773007.96198: done transferring module to remote 8223 1726773007.96215: _low_level_execute_command(): starting 8223 1726773007.96223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592/ /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592/AnsiballZ_dnf.py && sleep 0' 8223 1726773007.98824: stderr chunk (state=2): >>><<< 8223 1726773007.98836: stdout chunk (state=2): >>><<< 8223 1726773007.98857: _low_level_execute_command() done: rc=0, stdout=, stderr= 8223 1726773007.98863: _low_level_execute_command(): starting 8223 1726773007.98871: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592/AnsiballZ_dnf.py && sleep 0' 8223 1726773022.03254: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "procps-ng"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, 
"validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8223 1726773022.06520: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8223 1726773022.06576: stderr chunk (state=3): >>><<< 8223 1726773022.06587: stdout chunk (state=3): >>><<< 8223 1726773022.06611: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "procps-ng"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.150 closed. 8223 1726773022.06646: done with _execute_module (dnf, {'name': ['tuned', 'procps-ng'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8223 1726773022.06655: _low_level_execute_command(): starting 8223 1726773022.06660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773007.78744-8223-68537326430592/ > /dev/null 2>&1 && sleep 0' 8223 1726773022.09388: stderr chunk (state=2): >>><<< 8223 1726773022.09404: stdout chunk (state=2): >>><<< 8223 1726773022.09426: _low_level_execute_command() done: rc=0, stdout=, stderr= 8223 1726773022.09437: handler run complete 8223 1726773022.09443: attempt loop complete, returning result 8223 1726773022.09453: _execute() done 8223 1726773022.09455: dumping result to json 8223 1726773022.09460: done dumping result, returning 8223 1726773022.09473: done running TaskExecutor() for managed_node2/TASK: Ensure required packages are installed [12a3200b-1e9d-1dbd-cc52-00000000000d] 8223 1726773022.09492: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000d 8223 1726773022.09532: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000d 8223 1726773022.09537: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8119 1726773022.09781: no more pending results, returning what we have 8119 1726773022.09788: results queue empty 8119 1726773022.09791: checking for any_errors_fatal 8119 1726773022.09795: done checking for any_errors_fatal 8119 1726773022.09797: checking for max_fail_percentage 8119 1726773022.09799: done checking for max_fail_percentage 8119 1726773022.09801: checking to see if all hosts have failed and the running result is not ok 8119 1726773022.09803: done checking to see if all hosts have failed 8119 1726773022.09805: getting the remaining hosts for 
this loop 8119 1726773022.09810: done getting the remaining hosts for this loop 8119 1726773022.09818: building list of next tasks for hosts 8119 1726773022.09821: getting the next task for host managed_node2 8119 1726773022.09825: done getting next task for host managed_node2 8119 1726773022.09828: ^ task is: TASK: See if tuned has a profile subdir 8119 1726773022.09831: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773022.09833: done building task lists 8119 1726773022.09835: counting tasks in each state of execution 8119 1726773022.09839: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773022.09841: advancing hosts in ITERATING_TASKS 8119 1726773022.09843: starting to advance hosts 8119 1726773022.09845: getting the next task for host managed_node2 8119 1726773022.09848: done getting next task for host managed_node2 8119 1726773022.09850: ^ task is: TASK: See if tuned has a profile subdir 8119 1726773022.09853: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773022.09855: done advancing hosts to next task 8119 1726773022.09866: getting variables 8119 1726773022.09869: in VariableManager get_vars() 8119 1726773022.09893: Calling all_inventory to load vars for managed_node2 8119 1726773022.09899: Calling groups_inventory to load vars for managed_node2 8119 1726773022.09903: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773022.09934: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.09946: Calling all_plugins_play to load vars for managed_node2 8119 1726773022.09958: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.09966: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773022.09976: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.09984: Calling groups_plugins_play to load vars for managed_node2 8119 1726773022.10000: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.10023: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.10037: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.10240: done with get_vars() 8119 1726773022.10249: done getting variables 8119 1726773022.10253: sending task start callback, copying the task so we can template it temporarily 8119 1726773022.10255: done copying, going to template now 8119 1726773022.10257: done templating 8119 1726773022.10259: here goes the 
callback... TASK [See if tuned has a profile subdir] *************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:28 Thursday 19 September 2024 15:10:22 -0400 (0:00:14.511) 0:00:16.659 **** 8119 1726773022.10274: sending task start callback 8119 1726773022.10277: entering _queue_task() for managed_node2/stat 8119 1726773022.10403: worker is 1 (out of 1 available) 8119 1726773022.10444: exiting _queue_task() for managed_node2/stat 8119 1726773022.10516: done queuing things up, now waiting for results queue to drain 8119 1726773022.10522: waiting for pending results... 8402 1726773022.10556: running TaskExecutor() for managed_node2/TASK: See if tuned has a profile subdir 8402 1726773022.10604: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000000e 8402 1726773022.10648: calling self._execute() 8402 1726773022.10819: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8402 1726773022.10859: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8402 1726773022.10871: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8402 1726773022.10886: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8402 1726773022.10894: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8402 1726773022.11010: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8402 1726773022.11026: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8402 1726773022.11077: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8402 1726773022.11094: starting attempt loop 8402 1726773022.11096: running the handler 8402 1726773022.11107: _low_level_execute_command(): starting 8402 1726773022.11113: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8402 1726773022.13637: stdout chunk (state=2): >>>/root <<< 8402 1726773022.13759: stderr chunk (state=3): >>><<< 8402 1726773022.13765: stdout chunk (state=3): >>><<< 8402 1726773022.13791: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8402 1726773022.13808: _low_level_execute_command(): starting 8402 1726773022.13815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202 `" && echo ansible-tmp-1726773022.1380076-8402-35537928200202="` echo /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202 `" ) && sleep 0' 8402 1726773022.16484: stdout chunk (state=2): >>>ansible-tmp-1726773022.1380076-8402-35537928200202=/root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202 <<< 8402 1726773022.16623: stderr chunk (state=3): >>><<< 8402 1726773022.16629: stdout chunk (state=3): >>><<< 8402 1726773022.16649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773022.1380076-8402-35537928200202=/root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202 
, stderr= 8402 1726773022.16756: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 8402 1726773022.16820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202/AnsiballZ_stat.py 8402 1726773022.17219: Sending initial data 8402 1726773022.17233: Sent initial data (150 bytes) 8402 1726773022.19643: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpgo6r7e8h /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202/AnsiballZ_stat.py <<< 8402 1726773022.20646: stderr chunk (state=3): >>><<< 8402 1726773022.20653: stdout chunk (state=3): >>><<< 8402 1726773022.20677: done transferring module to remote 8402 1726773022.20693: _low_level_execute_command(): starting 8402 1726773022.20698: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202/ /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202/AnsiballZ_stat.py && sleep 0' 8402 1726773022.23323: stderr chunk (state=2): >>><<< 8402 1726773022.23340: stdout chunk (state=2): >>><<< 8402 1726773022.23363: _low_level_execute_command() done: rc=0, stdout=, stderr= 8402 1726773022.23367: _low_level_execute_command(): starting 8402 1726773022.23373: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202/AnsiballZ_stat.py && sleep 0' 8402 1726773022.38031: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8402 1726773022.39020: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8402 1726773022.39076: stderr chunk (state=3): >>><<< 8402 1726773022.39082: stdout chunk (state=3): >>><<< 8402 1726773022.39105: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 
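The two task results traced above ("Ensure required packages are installed", answered by the dnf module with "Nothing to do", and "See if tuned has a profile subdir", a stat of /etc/tuned/profiles) correspond to ordinary tasks in tests_change_settings.yml. A minimal sketch of what they could look like, reconstructed only from the module arguments visible in this log (the register variable name is an assumption, not taken from the playbook):

- name: Ensure required packages are installed
  package:
    name:
      - tuned
      - procps-ng
    state: present

- name: See if tuned has a profile subdir
  stat:
    path: /etc/tuned/profiles
  register: __tuned_profiles_stat   # assumed name; the log does not show the register target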
8402 1726773022.39133: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8402 1726773022.39146: _low_level_execute_command(): starting 8402 1726773022.39153: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773022.1380076-8402-35537928200202/ > /dev/null 2>&1 && sleep 0' 8402 1726773022.41817: stderr chunk (state=2): >>><<< 8402 1726773022.41830: stdout chunk (state=2): >>><<< 8402 1726773022.41853: _low_level_execute_command() done: rc=0, stdout=, stderr= 8402 1726773022.41861: handler run complete 8402 1726773022.41899: attempt loop complete, returning result 8402 1726773022.41911: _execute() done 8402 1726773022.41915: dumping result to json 8402 1726773022.41918: done dumping result, returning 8402 1726773022.41927: done running TaskExecutor() for managed_node2/TASK: See if tuned has a profile subdir [12a3200b-1e9d-1dbd-cc52-00000000000e] 8402 1726773022.41941: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000e 8402 1726773022.41980: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000e 8402 1726773022.41986: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 8119 1726773022.42192: no more pending results, returning what we have 8119 1726773022.42198: results queue empty 8119 1726773022.42200: checking for any_errors_fatal 8119 1726773022.42206: done checking for any_errors_fatal 8119 1726773022.42210: checking for max_fail_percentage 8119 1726773022.42213: done checking for max_fail_percentage 8119 1726773022.42215: checking to see if all hosts have failed and the running result is not ok 8119 1726773022.42217: done checking to see if all hosts have failed 8119 1726773022.42219: getting the remaining hosts for this loop 8119 1726773022.42221: done getting the remaining hosts for this loop 8119 1726773022.42229: building list of next tasks for hosts 8119 1726773022.42232: getting the next task for host managed_node2 8119 1726773022.42237: done getting next task for host managed_node2 8119 1726773022.42240: ^ task is: TASK: Set profile dir 8119 1726773022.42244: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773022.42246: done building task lists 8119 1726773022.42248: counting tasks in each state of execution 8119 1726773022.42250: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773022.42252: advancing hosts in ITERATING_TASKS 8119 1726773022.42253: starting to advance hosts 8119 1726773022.42255: getting the next task for host managed_node2 8119 1726773022.42257: done getting next task for host managed_node2 8119 1726773022.42258: ^ task is: TASK: Set profile dir 8119 1726773022.42260: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773022.42261: done advancing hosts to next task 8119 1726773022.42273: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773022.42275: getting variables 8119 1726773022.42277: in VariableManager get_vars() 8119 1726773022.42305: Calling all_inventory to load vars for managed_node2 8119 1726773022.42314: Calling groups_inventory to load vars for managed_node2 8119 1726773022.42317: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773022.42341: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.42351: Calling all_plugins_play to load vars for managed_node2 8119 1726773022.42365: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.42376: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773022.42389: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.42397: Calling groups_plugins_play to load vars for managed_node2 8119 1726773022.42413: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.42435: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.42449: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.42666: done with get_vars() 8119 1726773022.42676: done getting variables 8119 1726773022.42680: sending task start callback, copying the task so we can template it temporarily 8119 1726773022.42681: done copying, going to template now 8119 1726773022.42686: done templating 8119 1726773022.42687: here goes the callback... 
TASK [Set profile dir] ********************************************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:33 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.324) 0:00:16.983 **** 8119 1726773022.42704: sending task start callback 8119 1726773022.42706: entering _queue_task() for managed_node2/set_fact 8119 1726773022.42826: worker is 1 (out of 1 available) 8119 1726773022.42864: exiting _queue_task() for managed_node2/set_fact 8119 1726773022.42935: done queuing things up, now waiting for results queue to drain 8119 1726773022.42941: waiting for pending results... 8425 1726773022.42976: running TaskExecutor() for managed_node2/TASK: Set profile dir 8425 1726773022.43020: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000000f 8425 1726773022.43063: calling self._execute() 8425 1726773022.44928: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8425 1726773022.45035: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8425 1726773022.45103: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8425 1726773022.45145: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8425 1726773022.45185: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8425 1726773022.45222: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8425 1726773022.45275: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8425 1726773022.45310: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8425 1726773022.45336: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8425 1726773022.45535: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8425 1726773022.45560: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8425 1726773022.45580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8425 1726773022.45932: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8425 1726773022.45972: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8425 1726773022.45982: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8425 1726773022.45995: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8425 1726773022.46000: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8425 1726773022.46113: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8425 1726773022.46125: starting attempt loop 8425 1726773022.46127: running the handler 8425 
1726773022.46139: handler run complete 8425 1726773022.46142: attempt loop complete, returning result 8425 1726773022.46144: _execute() done 8425 1726773022.46146: dumping result to json 8425 1726773022.46148: done dumping result, returning 8425 1726773022.46153: done running TaskExecutor() for managed_node2/TASK: Set profile dir [12a3200b-1e9d-1dbd-cc52-00000000000f] 8425 1726773022.46161: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000f 8425 1726773022.46190: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000000f 8425 1726773022.46194: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__profile_dir": "/etc/tuned/kernel_settings" }, "changed": false } 8119 1726773022.46412: no more pending results, returning what we have 8119 1726773022.46417: results queue empty 8119 1726773022.46419: checking for any_errors_fatal 8119 1726773022.46425: done checking for any_errors_fatal 8119 1726773022.46427: checking for max_fail_percentage 8119 1726773022.46429: done checking for max_fail_percentage 8119 1726773022.46431: checking to see if all hosts have failed and the running result is not ok 8119 1726773022.46433: done checking to see if all hosts have failed 8119 1726773022.46435: getting the remaining hosts for this loop 8119 1726773022.46437: done getting the remaining hosts for this loop 8119 1726773022.46443: building list of next tasks for hosts 8119 1726773022.46445: getting the next task for host managed_node2 8119 1726773022.46450: done getting next task for host managed_node2 8119 1726773022.46452: ^ task is: TASK: Ensure kernel settings profile directory exists 8119 1726773022.46454: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773022.46455: done building task lists 8119 1726773022.46457: counting tasks in each state of execution 8119 1726773022.46459: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773022.46461: advancing hosts in ITERATING_TASKS 8119 1726773022.46462: starting to advance hosts 8119 1726773022.46463: getting the next task for host managed_node2 8119 1726773022.46465: done getting next task for host managed_node2 8119 1726773022.46467: ^ task is: TASK: Ensure kernel settings profile directory exists 8119 1726773022.46468: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773022.46469: done advancing hosts to next task 8119 1726773022.46479: getting variables 8119 1726773022.46481: in VariableManager get_vars() 8119 1726773022.46510: Calling all_inventory to load vars for managed_node2 8119 1726773022.46516: Calling groups_inventory to load vars for managed_node2 8119 1726773022.46518: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773022.46539: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.46549: Calling all_plugins_play to load vars for managed_node2 8119 1726773022.46559: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.46567: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773022.46577: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.46586: Calling groups_plugins_play to load vars for managed_node2 8119 1726773022.46596: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.46622: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.46637: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773022.46843: done with get_vars() 8119 1726773022.46853: done getting variables 8119 1726773022.46857: sending task start callback, copying the task so we can template it temporarily 8119 1726773022.46858: done copying, going to template now 8119 1726773022.46860: done templating 8119 1726773022.46861: here goes the callback... TASK [Ensure kernel settings profile directory exists] ************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:39 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.041) 0:00:17.025 **** 8119 1726773022.46876: sending task start callback 8119 1726773022.46877: entering _queue_task() for managed_node2/file 8119 1726773022.47000: worker is 1 (out of 1 available) 8119 1726773022.47040: exiting _queue_task() for managed_node2/file 8119 1726773022.47111: done queuing things up, now waiting for results queue to drain 8119 1726773022.47118: waiting for pending results... 
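The "Set profile dir" result above (ok: __profile_dir = "/etc/tuned/kernel_settings") is produced by a set_fact action, which runs entirely on the controller; that is why no _low_level_execute_command() calls appear for it. A minimal sketch, assuming the real task derives the value from the stat result rather than hard-coding it:

- name: Set profile dir
  set_fact:
    # the playbook presumably branches on the stat above; only the resolved value is visible in this log
    __profile_dir: /etc/tuned/kernel_settings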
8428 1726773022.47154: running TaskExecutor() for managed_node2/TASK: Ensure kernel settings profile directory exists 8428 1726773022.47200: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000010 8428 1726773022.47242: calling self._execute() 8428 1726773022.49457: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8428 1726773022.49558: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8428 1726773022.49622: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8428 1726773022.49658: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8428 1726773022.49696: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8428 1726773022.49736: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8428 1726773022.49793: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8428 1726773022.49822: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8428 1726773022.49862: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8428 1726773022.49989: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8428 1726773022.50015: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8428 1726773022.50037: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8428 1726773022.50199: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8428 1726773022.50233: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8428 1726773022.50244: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8428 1726773022.50261: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8428 1726773022.50269: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8428 1726773022.50353: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8428 1726773022.50372: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8428 1726773022.50405: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8428 1726773022.50416: starting attempt loop 8428 1726773022.50420: running the handler 8428 1726773022.50430: _low_level_execute_command(): starting 8428 1726773022.50436: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8428 1726773022.53426: stdout chunk (state=2): >>>/root <<< 8428 1726773022.53575: stderr chunk (state=3): >>><<< 8428 1726773022.53585: stdout chunk (state=3): >>><<< 8428 1726773022.53615: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 
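The task now being executed, "Ensure kernel settings profile directory exists", goes through the usual remote flow (home-directory probe, temporary directory creation, AnsiballZ transfer of the file module), and its module arguments appear in the result further below: path /etc/tuned/kernel_settings, state directory, mode 0755. A sketch of the corresponding task, assuming it references the fact set above rather than the literal path:

- name: Ensure kernel settings profile directory exists
  file:
    path: "{{ __profile_dir }}"   # resolves to /etc/tuned/kernel_settings per the set_fact above
    state: directory
    mode: "0755"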
8428 1726773022.53635: _low_level_execute_command(): starting 8428 1726773022.53643: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787 `" && echo ansible-tmp-1726773022.5362763-8428-163992482275787="` echo /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787 `" ) && sleep 0' 8428 1726773022.57419: stdout chunk (state=2): >>>ansible-tmp-1726773022.5362763-8428-163992482275787=/root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787 <<< 8428 1726773022.57657: stderr chunk (state=3): >>><<< 8428 1726773022.57666: stdout chunk (state=3): >>><<< 8428 1726773022.57697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773022.5362763-8428-163992482275787=/root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787 , stderr= 8428 1726773022.57809: ANSIBALLZ: Using lock for file 8428 1726773022.57814: ANSIBALLZ: Acquiring lock 8428 1726773022.57819: ANSIBALLZ: Lock acquired: 140408693991984 8428 1726773022.57822: ANSIBALLZ: Creating module 8428 1726773022.70466: ANSIBALLZ: Writing module into payload 8428 1726773022.70657: ANSIBALLZ: Writing module 8428 1726773022.70679: ANSIBALLZ: Renaming module 8428 1726773022.70686: ANSIBALLZ: Done creating module 8428 1726773022.70721: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787/AnsiballZ_file.py 8428 1726773022.71555: Sending initial data 8428 1726773022.71571: Sent initial data (151 bytes) 8428 1726773022.75252: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpofns0e76 /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787/AnsiballZ_file.py <<< 8428 1726773022.77322: stderr chunk (state=3): >>><<< 8428 1726773022.77332: stdout chunk (state=3): >>><<< 8428 1726773022.77363: done transferring module to remote 8428 1726773022.77384: _low_level_execute_command(): starting 8428 1726773022.77392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787/ /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787/AnsiballZ_file.py && sleep 0' 8428 1726773022.80542: stderr chunk (state=2): >>><<< 8428 1726773022.80559: stdout chunk (state=2): >>><<< 8428 1726773022.80589: _low_level_execute_command() done: rc=0, stdout=, stderr= 8428 1726773022.80597: _low_level_execute_command(): starting 8428 1726773022.80606: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787/AnsiballZ_file.py && sleep 0' 8428 1726773022.96641: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": 
null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 8428 1726773022.97736: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8428 1726773022.97743: stdout chunk (state=3): >>><<< 8428 1726773022.97756: stderr chunk (state=3): >>><<< 8428 1726773022.97775: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 8428 1726773022.97819: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8428 1726773022.97833: _low_level_execute_command(): starting 8428 1726773022.97839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773022.5362763-8428-163992482275787/ > /dev/null 2>&1 && sleep 0' 8428 1726773023.00626: stderr chunk (state=2): >>><<< 8428 1726773023.00638: stdout chunk (state=2): >>><<< 8428 1726773023.00657: _low_level_execute_command() done: rc=0, stdout=, stderr= 8428 1726773023.00663: handler run complete 8428 1726773023.00668: attempt loop complete, returning result 8428 1726773023.00677: _execute() done 8428 1726773023.00679: dumping result to json 8428 1726773023.00685: done dumping result, returning 8428 1726773023.00698: done running TaskExecutor() for managed_node2/TASK: Ensure kernel settings profile directory exists [12a3200b-1e9d-1dbd-cc52-000000000010] 8428 1726773023.00714: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000010 8428 1726773023.00754: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000010 8428 1726773023.00758: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "state": "directory", "uid": 
0 } 8119 1726773023.00920: no more pending results, returning what we have 8119 1726773023.00923: results queue empty 8119 1726773023.00925: checking for any_errors_fatal 8119 1726773023.00928: done checking for any_errors_fatal 8119 1726773023.00929: checking for max_fail_percentage 8119 1726773023.00931: done checking for max_fail_percentage 8119 1726773023.00932: checking to see if all hosts have failed and the running result is not ok 8119 1726773023.00934: done checking to see if all hosts have failed 8119 1726773023.00935: getting the remaining hosts for this loop 8119 1726773023.00937: done getting the remaining hosts for this loop 8119 1726773023.00943: building list of next tasks for hosts 8119 1726773023.00945: getting the next task for host managed_node2 8119 1726773023.00951: done getting next task for host managed_node2 8119 1726773023.00954: ^ task is: TASK: Generate a configuration for kernel settings 8119 1726773023.00957: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773023.00959: done building task lists 8119 1726773023.00961: counting tasks in each state of execution 8119 1726773023.00965: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773023.00967: advancing hosts in ITERATING_TASKS 8119 1726773023.00969: starting to advance hosts 8119 1726773023.00970: getting the next task for host managed_node2 8119 1726773023.00972: done getting next task for host managed_node2 8119 1726773023.00974: ^ task is: TASK: Generate a configuration for kernel settings 8119 1726773023.00976: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773023.00977: done advancing hosts to next task 8119 1726773023.01057: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773023.01063: getting variables 8119 1726773023.01066: in VariableManager get_vars() 8119 1726773023.01094: Calling all_inventory to load vars for managed_node2 8119 1726773023.01101: Calling groups_inventory to load vars for managed_node2 8119 1726773023.01104: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773023.01132: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.01146: Calling all_plugins_play to load vars for managed_node2 8119 1726773023.01161: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.01174: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773023.01193: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.01203: Calling groups_plugins_play to load vars for managed_node2 8119 1726773023.01220: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.01250: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.01272: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.01478: done with get_vars() 8119 1726773023.01491: done getting variables 8119 1726773023.01496: sending task start callback, copying the task so we can template it temporarily 8119 1726773023.01498: done copying, going to template now 8119 1726773023.01499: done templating 8119 1726773023.01501: here goes the callback... TASK [Generate a configuration for kernel settings] **************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:45 Thursday 19 September 2024 15:10:23 -0400 (0:00:00.546) 0:00:17.571 **** 8119 1726773023.01516: sending task start callback 8119 1726773023.01518: entering _queue_task() for managed_node2/copy 8119 1726773023.01635: worker is 1 (out of 1 available) 8119 1726773023.01674: exiting _queue_task() for managed_node2/copy 8119 1726773023.01744: done queuing things up, now waiting for results queue to drain 8119 1726773023.01750: waiting for pending results... 
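The copy task queued above deploys a tuned.conf as the kernel_settings profile configuration. Based on the search_path and copy module arguments logged by worker 8446 below (dest=/etc/tuned/kernel_settings/tuned.conf, mode=0644, original basename tuned.conf found under the test's files search path), an approximate reconstruction of the task at tests_change_settings.yml:45 is the following sketch; the src value is assumed from the logged search path and may differ in the actual playbook:

- name: Generate a configuration for kernel settings
  copy:
    # local test fixture resolved via the search_path shown below (assumed src)
    src: tuned/etc/tuned/change_settings/tuned.conf
    dest: /etc/tuned/kernel_settings/tuned.conf
    mode: "0644"
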
8446 1726773023.01788: running TaskExecutor() for managed_node2/TASK: Generate a configuration for kernel settings 8446 1726773023.01831: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000011 8446 1726773023.01873: calling self._execute() 8446 1726773023.03685: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8446 1726773023.03773: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8446 1726773023.03823: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8446 1726773023.03855: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8446 1726773023.03888: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8446 1726773023.03927: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8446 1726773023.03977: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8446 1726773023.04005: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8446 1726773023.04023: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8446 1726773023.04106: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8446 1726773023.04125: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8446 1726773023.04138: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8446 1726773023.04301: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8446 1726773023.04342: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8446 1726773023.04353: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8446 1726773023.04363: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8446 1726773023.04368: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8446 1726773023.04467: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8446 1726773023.04478: starting attempt loop 8446 1726773023.04480: running the handler 8446 1726773023.04488: _low_level_execute_command(): starting 8446 1726773023.04492: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8446 1726773023.06998: stdout chunk (state=2): >>>/root <<< 8446 1726773023.07125: stderr chunk (state=3): >>><<< 8446 1726773023.07130: stdout chunk (state=3): >>><<< 8446 1726773023.07155: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8446 1726773023.07169: _low_level_execute_command(): starting 8446 1726773023.07175: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878 `" && echo ansible-tmp-1726773023.0716383-8446-53511542983878="` echo /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878 `" ) && sleep 0' 8446 1726773023.09835: stdout chunk (state=2): >>>ansible-tmp-1726773023.0716383-8446-53511542983878=/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878 <<< 8446 1726773023.09963: stderr chunk (state=3): >>><<< 8446 1726773023.09970: stdout chunk (state=3): >>><<< 8446 1726773023.09990: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773023.0716383-8446-53511542983878=/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878 , stderr= 8446 1726773023.10008: evaluation_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 8446 1726773023.10033: search_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf 8446 1726773023.10133: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 8446 1726773023.10189: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_stat.py 8446 1726773023.10481: Sending initial data 8446 1726773023.10500: Sent initial data (150 bytes) 8446 1726773023.13033: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpbeblft50 /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_stat.py <<< 8446 1726773023.14038: stderr chunk (state=3): >>><<< 8446 1726773023.14046: stdout chunk (state=3): >>><<< 8446 1726773023.14068: done transferring module to remote 8446 1726773023.14082: _low_level_execute_command(): starting 8446 1726773023.14088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/ /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_stat.py && sleep 0' 8446 1726773023.16751: stderr chunk (state=2): >>><<< 8446 1726773023.16766: stdout chunk (state=2): >>><<< 8446 1726773023.16788: _low_level_execute_command() done: rc=0, stdout=, stderr= 8446 1726773023.16792: _low_level_execute_command(): starting 8446 1726773023.16798: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_stat.py && sleep 0' 8446 1726773023.31790: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 8446 1726773023.34205: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 8446 1726773023.34222: stdout chunk (state=3): >>><<< 8446 1726773023.34237: stderr chunk (state=3): >>><<< 8446 1726773023.34258: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 8446 1726773023.34294: done with _execute_module (stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8446 1726773023.35272: Sending initial data 8446 1726773023.35296: Sent initial data (213 bytes) 8446 1726773023.38122: stdout chunk (state=3): >>>sftp> put /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source <<< 8446 1726773023.38414: stderr chunk (state=3): >>><<< 8446 1726773023.38421: stdout chunk (state=3): >>><<< 8446 1726773023.38446: _low_level_execute_command(): starting 8446 1726773023.38451: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/ /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source && sleep 0' 8446 1726773023.41104: stderr chunk (state=2): >>><<< 8446 1726773023.41120: stdout chunk (state=2): >>><<< 8446 1726773023.41145: _low_level_execute_command() done: rc=0, stdout=, stderr= 8446 1726773023.41293: ANSIBALLZ: Using lock for copy 8446 1726773023.41298: ANSIBALLZ: Acquiring lock 8446 1726773023.41303: ANSIBALLZ: Lock acquired: 140408693992368 8446 1726773023.41306: ANSIBALLZ: Creating module 8446 1726773023.54801: ANSIBALLZ: Writing module into payload 8446 1726773023.54954: ANSIBALLZ: Writing module 8446 1726773023.54968: ANSIBALLZ: Renaming module 8446 1726773023.54972: ANSIBALLZ: Done creating module 8446 1726773023.54998: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_copy.py 8446 1726773023.55650: Sending initial data 8446 1726773023.55664: Sent initial data (150 bytes) 8446 1726773023.58223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpqal37_ek /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_copy.py <<< 8446 1726773023.59501: stderr chunk (state=3): >>><<< 8446 1726773023.59509: stdout chunk (state=3): >>><<< 8446 1726773023.59545: done transferring module to remote 8446 1726773023.59562: _low_level_execute_command(): starting 8446 1726773023.59569: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/ 
/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_copy.py && sleep 0' 8446 1726773023.62423: stderr chunk (state=2): >>><<< 8446 1726773023.62440: stdout chunk (state=2): >>><<< 8446 1726773023.62466: _low_level_execute_command() done: rc=0, stdout=, stderr= 8446 1726773023.62471: _low_level_execute_command(): starting 8446 1726773023.62478: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/AnsiballZ_copy.py && sleep 0' 8446 1726773023.78658: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source", "md5sum": "d5df32baf1a63528844555117ead6672", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "_original_basename": "tuned.conf", "follow": false, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 8446 1726773023.79748: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8446 1726773023.80092: stderr chunk (state=3): >>><<< 8446 1726773023.80099: stdout chunk (state=3): >>><<< 8446 1726773023.80131: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source", "md5sum": "d5df32baf1a63528844555117ead6672", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "_original_basename": "tuned.conf", "follow": false, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
8446 1726773023.80178: done with _execute_module (copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', '_original_basename': 'tuned.conf', 'follow': False, 'checksum': '13fdc203370e2b8e7e42c13d94b671b1ac621563', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8446 1726773023.80196: _low_level_execute_command(): starting 8446 1726773023.80202: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/ > /dev/null 2>&1 && sleep 0' 8446 1726773023.83179: stderr chunk (state=2): >>><<< 8446 1726773023.83194: stdout chunk (state=2): >>><<< 8446 1726773023.83220: _low_level_execute_command() done: rc=0, stdout=, stderr= 8446 1726773023.83231: handler run complete 8446 1726773023.83238: attempt loop complete, returning result 8446 1726773023.83251: _execute() done 8446 1726773023.83254: dumping result to json 8446 1726773023.83261: done dumping result, returning 8446 1726773023.83275: done running TaskExecutor() for managed_node2/TASK: Generate a configuration for kernel settings [12a3200b-1e9d-1dbd-cc52-000000000011] 8446 1726773023.83296: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000011 8446 1726773023.83334: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000011 8446 1726773023.83339: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "d5df32baf1a63528844555117ead6672", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "src": "/root/.ansible/tmp/ansible-tmp-1726773023.0716383-8446-53511542983878/source", "state": "file", "uid": 0 } 8119 1726773023.83791: no more pending results, returning what we have 8119 1726773023.83797: results queue empty 8119 1726773023.83800: checking for any_errors_fatal 8119 1726773023.83806: done checking for any_errors_fatal 8119 1726773023.83808: checking for max_fail_percentage 8119 1726773023.83811: done checking for max_fail_percentage 8119 1726773023.83813: checking to see if all hosts have failed and the running result is not ok 8119 1726773023.83816: done checking to see if all hosts have failed 8119 1726773023.83818: getting the remaining hosts for this loop 8119 1726773023.83821: done getting the remaining hosts for this loop 8119 1726773023.83829: building list of next tasks for hosts 8119 1726773023.83832: getting the next task for host managed_node2 8119 1726773023.83839: done getting next task for host managed_node2 8119 1726773023.83842: ^ task is: TASK: Ensure required services are enabled and started 8119 1726773023.83845: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 8119 1726773023.83848: done building task lists 8119 1726773023.83850: counting tasks in each state of execution 8119 1726773023.83853: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773023.83856: advancing hosts in ITERATING_TASKS 8119 1726773023.83858: starting to advance hosts 8119 1726773023.83860: getting the next task for host managed_node2 8119 1726773023.83863: done getting next task for host managed_node2 8119 1726773023.83866: ^ task is: TASK: Ensure required services are enabled and started 8119 1726773023.83868: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773023.83871: done advancing hosts to next task 8119 1726773023.83937: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773023.83943: getting variables 8119 1726773023.83947: in VariableManager get_vars() 8119 1726773023.83977: Calling all_inventory to load vars for managed_node2 8119 1726773023.83985: Calling groups_inventory to load vars for managed_node2 8119 1726773023.83990: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773023.84021: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.84037: Calling all_plugins_play to load vars for managed_node2 8119 1726773023.84055: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.84070: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773023.84091: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.84103: Calling groups_plugins_play to load vars for managed_node2 8119 1726773023.84120: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.84153: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.84178: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773023.84502: done with get_vars() 8119 1726773023.84515: done getting variables 8119 1726773023.84521: sending task start callback, copying the task so we can template it temporarily 8119 1726773023.84524: done copying, going to template now 8119 1726773023.84527: done templating 8119 1726773023.84529: here goes the callback... 
TASK [Ensure required services are enabled and started] ************************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:51 Thursday 19 September 2024 15:10:23 -0400 (0:00:00.830) 0:00:18.401 **** 8119 1726773023.84550: sending task start callback 8119 1726773023.84553: entering _queue_task() for managed_node2/service 8119 1726773023.84556: Creating lock for service 8119 1726773023.84719: worker is 1 (out of 1 available) 8119 1726773023.84754: exiting _queue_task() for managed_node2/service 8119 1726773023.84830: done queuing things up, now waiting for results queue to drain 8119 1726773023.84836: waiting for pending results... 8480 1726773023.85052: running TaskExecutor() for managed_node2/TASK: Ensure required services are enabled and started 8480 1726773023.85112: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000012 8480 1726773023.85170: calling self._execute() 8480 1726773023.85430: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8480 1726773023.85488: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8480 1726773023.85505: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8480 1726773023.85523: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8480 1726773023.85533: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8480 1726773023.85702: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8480 1726773023.85720: starting attempt loop 8480 1726773023.85723: running the handler 8480 1726773023.87993: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8480 1726773023.88131: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8480 1726773023.88214: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8480 1726773023.88254: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8480 1726773023.88296: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8480 1726773023.88336: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8480 1726773023.88396: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8480 1726773023.88428: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8480 1726773023.88453: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8480 1726773023.88565: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8480 1726773023.88592: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8480 1726773023.88615: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8480 1726773023.88873: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity 8480 1726773023.88891: trying /usr/local/lib/python3.9/site-packages/ansible/modules/inventory 8480 1726773023.88902: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging 8480 1726773023.88909: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring 8480 1726773023.88961: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools 8480 1726773023.88998: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network 8480 1726773023.89049: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification 8480 1726773023.89101: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging 8480 1726773023.89110: trying /usr/local/lib/python3.9/site-packages/ansible/modules/remote_management 8480 1726773023.89129: trying /usr/local/lib/python3.9/site-packages/ansible/modules/source_control 8480 1726773023.89168: trying /usr/local/lib/python3.9/site-packages/ansible/modules/storage 8480 1726773023.89180: trying /usr/local/lib/python3.9/site-packages/ansible/modules/system 8480 1726773023.89326: _low_level_execute_command(): starting 8480 1726773023.89333: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8480 1726773023.92049: stdout chunk (state=2): >>>/root <<< 8480 1726773023.92191: stderr chunk (state=3): >>><<< 8480 1726773023.92199: stdout chunk (state=3): >>><<< 8480 1726773023.92225: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8480 1726773023.92245: _low_level_execute_command(): starting 8480 1726773023.92253: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600 `" && echo ansible-tmp-1726773023.9223714-8480-136417543237600="` echo /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600 `" ) && sleep 0' 8480 1726773023.95109: stdout chunk (state=2): >>>ansible-tmp-1726773023.9223714-8480-136417543237600=/root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600 <<< 8480 1726773023.95247: stderr chunk (state=3): >>><<< 8480 1726773023.95259: stdout chunk (state=3): >>><<< 8480 1726773023.95279: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773023.9223714-8480-136417543237600=/root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600 , stderr= 8480 1726773023.95401: ANSIBALLZ: Using generic lock for systemd 8480 1726773023.95406: ANSIBALLZ: Acquiring lock 8480 1726773023.95411: ANSIBALLZ: Lock acquired: 140408695168400 8480 1726773023.95414: ANSIBALLZ: Creating module 8480 1726773024.24699: ANSIBALLZ: Writing module into payload 8480 1726773024.24932: ANSIBALLZ: Writing module 8480 1726773024.24962: ANSIBALLZ: Renaming module 8480 1726773024.24968: ANSIBALLZ: Done creating module 8480 1726773024.25024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600/AnsiballZ_systemd.py 8480 1726773024.26532: Sending initial data 8480 1726773024.26545: Sent initial data (154 bytes) 8480 1726773024.29258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpc7zp4j_0 /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600/AnsiballZ_systemd.py <<< 8480 1726773024.31542: stderr chunk (state=3): >>><<< 8480 1726773024.31551: stdout chunk 
(state=3): >>><<< 8480 1726773024.31586: done transferring module to remote 8480 1726773024.31610: _low_level_execute_command(): starting 8480 1726773024.31618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600/ /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600/AnsiballZ_systemd.py && sleep 0' 8480 1726773024.35694: stderr chunk (state=2): >>><<< 8480 1726773024.35708: stdout chunk (state=2): >>><<< 8480 1726773024.35728: _low_level_execute_command() done: rc=0, stdout=, stderr= 8480 1726773024.35732: _low_level_execute_command(): starting 8480 1726773024.35739: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600/AnsiballZ_systemd.py && sleep 0' 8480 1726773024.99197: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:44 EDT", "WatchdogTimestampMonotonic": "23025174", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:43 EDT", "ExecMainStartTimestampMonotonic": "22083755", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:43 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18612224", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "Before": "shutdown.target multi-user.target", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:44 EDT", "StateChangeTimestampMonotonic": "23025176", 
"InactiveExitTimestamp": "Thu 2024-09-19 15:03:43 EDT", "InactiveExitTimestampMonotonic": "22083791", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ActiveEnterTimestampMonotonic": "23025176", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:43 EDT", "ConditionTimestampMonotonic": "22082777", "AssertTimestamp": "Thu 2024-09-19 15:03:43 EDT", "AssertTimestampMonotonic": "22082779", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "e66c27dd42644f6b88bfc7de37ca7ab0", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 8480 1726773025.00755: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8480 1726773025.00766: stdout chunk (state=3): >>><<< 8480 1726773025.00778: stderr chunk (state=3): >>><<< 8480 1726773025.00800: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:44 EDT", "WatchdogTimestampMonotonic": "23025174", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:43 EDT", "ExecMainStartTimestampMonotonic": "22083755", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:43 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18612224", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": 
"no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.socket system.slice dbus.service sysinit.target", 
"WantedBy": "multi-user.target", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "Before": "shutdown.target multi-user.target", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:44 EDT", "StateChangeTimestampMonotonic": "23025176", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:43 EDT", "InactiveExitTimestampMonotonic": "22083791", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ActiveEnterTimestampMonotonic": "23025176", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:43 EDT", "ConditionTimestampMonotonic": "22082777", "AssertTimestamp": "Thu 2024-09-19 15:03:43 EDT", "AssertTimestampMonotonic": "22082779", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "e66c27dd42644f6b88bfc7de37ca7ab0", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
8480 1726773025.01176: done with _execute_module (systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8480 1726773025.01191: _low_level_execute_command(): starting 8480 1726773025.01199: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773023.9223714-8480-136417543237600/ > /dev/null 2>&1 && sleep 0' 8480 1726773025.04145: stderr chunk (state=2): >>><<< 8480 1726773025.04160: stdout chunk (state=2): >>><<< 8480 1726773025.04186: _low_level_execute_command() done: rc=0, stdout=, stderr= 8480 1726773025.04203: handler run complete 8480 1726773025.04210: attempt loop complete, returning result 8480 1726773025.04222: _execute() done 8480 1726773025.04225: dumping result to json 8480 1726773025.04250: done dumping result, returning 8480 1726773025.04265: done running TaskExecutor() for managed_node2/TASK: Ensure required services are enabled and started [12a3200b-1e9d-1dbd-cc52-000000000012] 8480 1726773025.04286: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000012 8480 1726773025.04340: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000012 8480 1726773025.04346: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "enabled": true, "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ActiveEnterTimestampMonotonic": "23025176", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:03:43 EDT", "AssertTimestampMonotonic": "22082779", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend 
cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:43 EDT", "ConditionTimestampMonotonic": "22082777", "ConfigurationDirectoryMode": "0755", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:43 EDT", "ExecMainStartTimestampMonotonic": "22083755", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:43 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:43 EDT", "InactiveExitTimestampMonotonic": "22083791", "InvocationID": "e66c27dd42644f6b88bfc7de37ca7ab0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "671", "MemoryAccounting": "yes", "MemoryCurrent": "18612224", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:03:44 EDT", "StateChangeTimestampMonotonic": "23025176", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:03:44 EDT", "WatchdogTimestampMonotonic": "23025174", "WatchdogUSec": "0" } } 8119 1726773025.05313: no more pending results, returning what we have 8119 1726773025.05320: results queue empty 8119 1726773025.05323: checking for any_errors_fatal 8119 1726773025.05328: done checking for any_errors_fatal 8119 1726773025.05330: checking for max_fail_percentage 8119 1726773025.05333: done checking for max_fail_percentage 8119 1726773025.05336: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.05338: done checking to see if all hosts have failed 8119 1726773025.05340: getting the remaining hosts for this loop 8119 1726773025.05343: done getting the remaining hosts for this loop 8119 1726773025.05352: building list of next tasks for hosts 8119 1726773025.05355: getting the next task for host managed_node2 8119 1726773025.05362: done getting next task for host managed_node2 8119 1726773025.05366: ^ task is: TASK: Apply kernel_settings 8119 1726773025.05369: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.05371: done building task lists 8119 1726773025.05374: counting tasks in each state of execution 8119 1726773025.05377: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.05380: advancing hosts in ITERATING_TASKS 8119 1726773025.05382: starting to advance hosts 8119 1726773025.05387: getting the next task for host managed_node2 8119 1726773025.05390: done getting next task for host managed_node2 8119 1726773025.05393: ^ task is: TASK: Apply kernel_settings 8119 1726773025.05395: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.05397: done advancing hosts to next task 8119 1726773025.05413: getting variables 8119 1726773025.05416: in VariableManager get_vars() 8119 1726773025.05446: Calling all_inventory to load vars for managed_node2 8119 1726773025.05454: Calling groups_inventory to load vars for managed_node2 8119 1726773025.05458: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.05489: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.05506: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.05525: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.05540: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.05558: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.05569: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.05587: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.05619: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.05644: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.05977: done with get_vars() 8119 1726773025.05992: done getting variables 8119 1726773025.05999: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.06001: done copying, going to template now 8119 1726773025.06004: done templating 8119 1726773025.06006: here goes the callback... 
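At this point the strategy has advanced managed_node2 to the "Apply kernel_settings" task; the callback banner that follows shows it comes from tests_change_settings.yml:57, and the worker queues it as an include_role of fedora.linux_system_roles.kernel_settings. A minimal sketch of what that test task presumably looks like, reconstructed from these records rather than from the test file itself:

- hosts: managed_node2
  tasks:
    - name: Apply kernel_settings
      include_role:
        name: fedora.linux_system_roles.kernel_settings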
TASK [Apply kernel_settings] *************************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:57 Thursday 19 September 2024 15:10:25 -0400 (0:00:01.214) 0:00:19.616 **** 8119 1726773025.06028: sending task start callback 8119 1726773025.06031: entering _queue_task() for managed_node2/include_role 8119 1726773025.06034: Creating lock for include_role 8119 1726773025.06218: worker is 1 (out of 1 available) 8119 1726773025.06254: exiting _queue_task() for managed_node2/include_role 8119 1726773025.06321: done queuing things up, now waiting for results queue to drain 8119 1726773025.06326: waiting for pending results... 8542 1726773025.06563: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings 8542 1726773025.06620: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000013 8542 1726773025.06669: calling self._execute() 8542 1726773025.06800: _execute() done 8542 1726773025.06806: dumping result to json 8542 1726773025.06809: done dumping result, returning 8542 1726773025.06814: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings [12a3200b-1e9d-1dbd-cc52-000000000013] 8542 1726773025.06826: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000013 8542 1726773025.07098: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000013 8542 1726773025.07103: WORKER PROCESS EXITING 8119 1726773025.07422: no more pending results, returning what we have 8119 1726773025.07431: in VariableManager get_vars() 8119 1726773025.07464: Calling all_inventory to load vars for managed_node2 8119 1726773025.07471: Calling groups_inventory to load vars for managed_node2 8119 1726773025.07475: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.07508: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.07524: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.07542: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.07555: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.07571: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.07580: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.07599: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.07629: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.07653: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.07929: done with get_vars() 8119 1726773025.08367: we have included files to process 8119 1726773025.08371: generating all_blocks data 8119 1726773025.08374: done generating all_blocks data 8119 1726773025.08377: processing included file: fedora.linux_system_roles.kernel_settings 8119 1726773025.08395: in VariableManager get_vars() 8119 1726773025.08411: done with 
get_vars() 8119 1726773025.08482: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8119 1726773025.08581: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8119 1726773025.08636: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8119 1726773025.08740: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8119 1726773025.09255: in VariableManager get_vars() 8119 1726773025.09279: done with get_vars() 8119 1726773025.09434: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773025.09505: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773025.09605: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773025.09650: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773025.09673: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity 8119 1726773025.09690: trying /usr/local/lib/python3.9/site-packages/ansible/modules/inventory 8119 1726773025.09702: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging 8119 1726773025.09707: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring 8119 1726773025.09749: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools 8119 1726773025.09770: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network 8119 1726773025.09816: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification 8119 1726773025.09857: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging 8119 1726773025.09868: trying /usr/local/lib/python3.9/site-packages/ansible/modules/remote_management 8119 1726773025.09881: trying /usr/local/lib/python3.9/site-packages/ansible/modules/source_control 8119 1726773025.09907: trying /usr/local/lib/python3.9/site-packages/ansible/modules/storage 8119 1726773025.09922: trying /usr/local/lib/python3.9/site-packages/ansible/modules/system 8119 1726773025.10032: trying /usr/local/lib/python3.9/site-packages/ansible/modules/utilities 8119 1726773025.10044: trying /usr/local/lib/python3.9/site-packages/ansible/modules/web_infrastructure 8119 1726773025.10080: trying /usr/local/lib/python3.9/site-packages/ansible/modules/__pycache__ 8119 1726773025.10092: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/alicloud 8119 1726773025.10102: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/amazon 8119 1726773025.10479: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/atomic 8119 1726773025.10495: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/azure 8119 1726773025.10914: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/centurylink 8119 1726773025.10939: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudscale 8119 1726773025.10950: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudstack 8119 1726773025.11017: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/digital_ocean 
8119 1726773025.11054: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/dimensiondata 8119 1726773025.11065: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/docker 8119 1726773025.11107: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/google 8119 1726773025.11335: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/hcloud 8119 1726773025.11363: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/heroku 8119 1726773025.11368: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/huawei 8119 1726773025.11373: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/kubevirt 8119 1726773025.11382: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/linode 8119 1726773025.11390: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxc 8119 1726773025.11395: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxd 8119 1726773025.11400: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/memset 8119 1726773025.11412: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/misc 8119 1726773025.11433: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oneandone 8119 1726773025.11444: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/online 8119 1726773025.11451: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/opennebula 8119 1726773025.11460: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/openstack 8119 1726773025.11523: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oracle 8119 1726773025.11529: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovh 8119 1726773025.11537: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovirt 8119 1726773025.11644: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/packet 8119 1726773025.11655: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/podman 8119 1726773025.11665: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/profitbricks 8119 1726773025.11679: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/pubnub 8119 1726773025.11688: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/rackspace 8119 1726773025.11732: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/scaleway 8119 1726773025.11771: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/smartos 8119 1726773025.11785: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/softlayer 8119 1726773025.11806: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/spotinst 8119 1726773025.11813: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/univention 8119 1726773025.11824: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vmware 8119 1726773025.12028: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vultr 8119 1726773025.12091: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/webfaction 8119 1726773025.12105: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/xenserver 8119 1726773025.12117: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/__pycache__ 8119 1726773025.12123: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/alicloud/__pycache__ 8119 1726773025.12131: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/cloud/amazon/__pycache__ 8119 1726773025.12372: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/atomic/__pycache__ 8119 1726773025.12385: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/azure/__pycache__ 8119 1726773025.12621: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/centurylink/__pycache__ 8119 1726773025.12639: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudscale/__pycache__ 8119 1726773025.12651: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/cloudstack/__pycache__ 8119 1726773025.12705: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/digital_ocean/__pycache__ 8119 1726773025.12732: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/dimensiondata/__pycache__ 8119 1726773025.12741: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/docker/__pycache__ 8119 1726773025.12771: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/google/__pycache__ 8119 1726773025.12910: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/hcloud/__pycache__ 8119 1726773025.12930: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/heroku/__pycache__ 8119 1726773025.12934: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/huawei/__pycache__ 8119 1726773025.12939: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/kubevirt/__pycache__ 8119 1726773025.12945: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/linode/__pycache__ 8119 1726773025.12950: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxc/__pycache__ 8119 1726773025.12954: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/lxd/__pycache__ 8119 1726773025.12958: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/memset/__pycache__ 8119 1726773025.12966: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/misc/__pycache__ 8119 1726773025.12980: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oneandone/__pycache__ 8119 1726773025.12992: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/online/__pycache__ 8119 1726773025.12999: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/opennebula/__pycache__ 8119 1726773025.13005: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/openstack/__pycache__ 8119 1726773025.13045: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/oracle/__pycache__ 8119 1726773025.13049: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovh/__pycache__ 8119 1726773025.13053: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/ovirt/__pycache__ 8119 1726773025.13107: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/packet/__pycache__ 8119 1726773025.13115: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/podman/__pycache__ 8119 1726773025.13120: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/profitbricks/__pycache__ 8119 1726773025.13127: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/pubnub/__pycache__ 8119 1726773025.13131: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/rackspace/__pycache__ 8119 1726773025.13148: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/scaleway/__pycache__ 8119 1726773025.13164: 
trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/smartos/__pycache__ 8119 1726773025.13170: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/softlayer/__pycache__ 8119 1726773025.13174: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/spotinst/__pycache__ 8119 1726773025.13177: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/univention/__pycache__ 8119 1726773025.13185: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vmware/__pycache__ 8119 1726773025.13285: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/vultr/__pycache__ 8119 1726773025.13319: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/webfaction/__pycache__ 8119 1726773025.13329: trying /usr/local/lib/python3.9/site-packages/ansible/modules/cloud/xenserver/__pycache__ 8119 1726773025.13337: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/k8s 8119 1726773025.13349: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/openshift 8119 1726773025.13356: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/__pycache__ 8119 1726773025.13363: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/k8s/__pycache__ 8119 1726773025.13371: trying /usr/local/lib/python3.9/site-packages/ansible/modules/clustering/openshift/__pycache__ 8119 1726773025.13375: trying /usr/local/lib/python3.9/site-packages/ansible/modules/commands/__pycache__ 8119 1726773025.13384: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/acme 8119 1726773025.13396: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/entrust 8119 1726773025.13401: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/__pycache__ 8119 1726773025.13413: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/acme/__pycache__ 8119 1726773025.13421: trying /usr/local/lib/python3.9/site-packages/ansible/modules/crypto/entrust/__pycache__ 8119 1726773025.13426: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/aerospike 8119 1726773025.13433: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/influxdb 8119 1726773025.13443: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/misc 8119 1726773025.13452: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mongodb 8119 1726773025.13459: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mssql 8119 1726773025.13464: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mysql 8119 1726773025.13472: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/postgresql 8119 1726773025.13495: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/proxysql 8119 1726773025.13506: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/vertica 8119 1726773025.13516: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/__pycache__ 8119 1726773025.13520: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/aerospike/__pycache__ 8119 1726773025.13524: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/influxdb/__pycache__ 8119 1726773025.13530: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/misc/__pycache__ 8119 1726773025.13536: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mongodb/__pycache__ 8119 1726773025.13544: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/database/mssql/__pycache__ 8119 1726773025.13549: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/mysql/__pycache__ 8119 1726773025.13557: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/postgresql/__pycache__ 8119 1726773025.13573: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/proxysql/__pycache__ 8119 1726773025.13581: trying /usr/local/lib/python3.9/site-packages/ansible/modules/database/vertica/__pycache__ 8119 1726773025.13590: trying /usr/local/lib/python3.9/site-packages/ansible/modules/files/__pycache__ 8119 1726773025.13605: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/cyberark 8119 1726773025.13612: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/ipa 8119 1726773025.13629: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/keycloak 8119 1726773025.13636: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/opendj 8119 1726773025.13640: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/__pycache__ 8119 1726773025.13644: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/cyberark/__pycache__ 8119 1726773025.13649: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/ipa/__pycache__ 8119 1726773025.13663: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/keycloak/__pycache__ 8119 1726773025.13671: trying /usr/local/lib/python3.9/site-packages/ansible/modules/identity/opendj/__pycache__ 8119 1726773025.13676: trying /usr/local/lib/python3.9/site-packages/ansible/modules/inventory/__pycache__ 8119 1726773025.13681: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging/rabbitmq 8119 1726773025.13696: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging/__pycache__ 8119 1726773025.13699: trying /usr/local/lib/python3.9/site-packages/ansible/modules/messaging/rabbitmq/__pycache__ 8119 1726773025.13709: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring/zabbix 8119 1726773025.13726: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring/__pycache__ 8119 1726773025.13747: trying /usr/local/lib/python3.9/site-packages/ansible/modules/monitoring/zabbix/__pycache__ 8119 1726773025.13758: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics 8119 1726773025.13905: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773025.14053: in VariableManager get_vars() 8119 1726773025.14079: done with get_vars() 8119 1726773025.14160: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8119 1726773025.14561: iterating over new_blocks loaded from include file 8119 1726773025.14566: in VariableManager get_vars() 8119 1726773025.14616: done with get_vars() 8119 1726773025.14621: filtering new block on tags 8119 1726773025.14698: done filtering new block on tags 8119 1726773025.14712: in VariableManager get_vars() 8119 1726773025.14737: done with get_vars() 8119 1726773025.14741: filtering new block on tags 8119 1726773025.14802: done filtering new block on tags 8119 1726773025.14815: in VariableManager get_vars() 8119 1726773025.14837: done with get_vars() 8119 1726773025.14841: filtering new block on tags 8119 
1726773025.15077: done filtering new block on tags 8119 1726773025.15089: done iterating over new_blocks loaded from include file 8119 1726773025.15093: extending task lists for all hosts with included blocks 8119 1726773025.15699: done extending task lists 8119 1726773025.15704: done processing included files 8119 1726773025.15707: results queue empty 8119 1726773025.15709: checking for any_errors_fatal 8119 1726773025.15716: done checking for any_errors_fatal 8119 1726773025.15718: checking for max_fail_percentage 8119 1726773025.15721: done checking for max_fail_percentage 8119 1726773025.15723: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.15725: done checking to see if all hosts have failed 8119 1726773025.15727: getting the remaining hosts for this loop 8119 1726773025.15730: done getting the remaining hosts for this loop 8119 1726773025.15736: building list of next tasks for hosts 8119 1726773025.15739: getting the next task for host managed_node2 8119 1726773025.15745: done getting next task for host managed_node2 8119 1726773025.15749: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8119 1726773025.15753: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.15755: done building task lists 8119 1726773025.15757: counting tasks in each state of execution 8119 1726773025.15761: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.15763: advancing hosts in ITERATING_TASKS 8119 1726773025.15766: starting to advance hosts 8119 1726773025.15768: getting the next task for host managed_node2 8119 1726773025.15771: done getting next task for host managed_node2 8119 1726773025.15773: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8119 1726773025.15775: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.15776: done advancing hosts to next task 8119 1726773025.15820: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773025.15825: getting variables 8119 1726773025.15827: in VariableManager get_vars() 8119 1726773025.15844: Calling all_inventory to load vars for managed_node2 8119 1726773025.15849: Calling groups_inventory to load vars for managed_node2 8119 1726773025.15853: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.15882: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.15895: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.15907: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.15917: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.15928: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.15934: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.15943: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.15964: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.15979: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.16173: done with get_vars() 8119 1726773025.16187: done getting variables 8119 1726773025.16193: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.16194: done copying, going to template now 8119 1726773025.16196: done templating 8119 1726773025.16197: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.101) 0:00:19.718 **** 8119 1726773025.16214: sending task start callback 8119 1726773025.16218: entering _queue_task() for managed_node2/fail 8119 1726773025.16220: Creating lock for fail 8119 1726773025.16369: worker is 1 (out of 1 available) 8119 1726773025.16410: exiting _queue_task() for managed_node2/fail 8119 1726773025.16484: done queuing things up, now waiting for results queue to drain 8119 1726773025.16490: waiting for pending results... 
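The records above queue the role's first task, "Check sysctl settings for boolean values", which the plugin loader resolves to the fail action module; the next chunk shows its when condition evaluating to False, so the task is skipped. A minimal sketch of the general shape of such a guarded fail task, with a purely hypothetical condition and message (the role's real expression is not shown in this log):

- name: Check sysctl settings for boolean values
  fail:
    msg: kernel_settings_sysctl must not contain boolean values
  # hypothetical guard expression; in this run the conditional evaluated to False,
  # so the task was reported as skipping with "Conditional result was False"
  when: kernel_settings_has_boolean_sysctl | d(false)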
8548 1726773025.16552: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8548 1726773025.16604: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000a6 8548 1726773025.16658: calling self._execute() 8548 1726773025.18401: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8548 1726773025.18477: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8548 1726773025.18584: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8548 1726773025.18623: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8548 1726773025.18652: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8548 1726773025.18678: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8548 1726773025.18738: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8548 1726773025.18771: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8548 1726773025.18798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8548 1726773025.18924: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8548 1726773025.18944: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8548 1726773025.18962: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8548 1726773025.19856: when evaluation is False, skipping this task 8548 1726773025.19861: _execute() done 8548 1726773025.19863: dumping result to json 8548 1726773025.19864: done dumping result, returning 8548 1726773025.19869: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [12a3200b-1e9d-1dbd-cc52-0000000000a6] 8548 1726773025.19877: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000a6 8548 1726773025.19902: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000a6 8548 1726773025.19955: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773025.20132: no more pending results, returning what we have 8119 1726773025.20136: results queue empty 8119 1726773025.20140: checking for any_errors_fatal 8119 1726773025.20146: done checking for any_errors_fatal 8119 1726773025.20149: checking for max_fail_percentage 8119 1726773025.20151: done checking for max_fail_percentage 8119 1726773025.20153: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.20155: done checking to see if all hosts have failed 8119 1726773025.20157: getting the remaining hosts for this loop 8119 1726773025.20160: done getting the remaining hosts for this loop 8119 1726773025.20167: building list of next tasks for hosts 8119 1726773025.20168: getting the next task for host managed_node2 8119 1726773025.20174: done getting next task for host managed_node2 8119 1726773025.20178: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 
1726773025.20181: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.20185: done building task lists 8119 1726773025.20187: counting tasks in each state of execution 8119 1726773025.20192: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.20194: advancing hosts in ITERATING_TASKS 8119 1726773025.20195: starting to advance hosts 8119 1726773025.20197: getting the next task for host managed_node2 8119 1726773025.20200: done getting next task for host managed_node2 8119 1726773025.20202: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773025.20204: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.20206: done advancing hosts to next task 8119 1726773025.20217: getting variables 8119 1726773025.20220: in VariableManager get_vars() 8119 1726773025.20245: Calling all_inventory to load vars for managed_node2 8119 1726773025.20248: Calling groups_inventory to load vars for managed_node2 8119 1726773025.20250: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.20270: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.20281: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.20294: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.20306: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.20321: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.20328: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.20338: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.20356: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.20369: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.20589: done with get_vars() 8119 1726773025.20599: done getting variables 8119 1726773025.20603: sending task start callback, copying the task so we 
can template it temporarily 8119 1726773025.20605: done copying, going to template now 8119 1726773025.20607: done templating 8119 1726773025.20609: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.044) 0:00:19.762 **** 8119 1726773025.20628: sending task start callback 8119 1726773025.20630: entering _queue_task() for managed_node2/include_tasks 8119 1726773025.20632: Creating lock for include_tasks 8119 1726773025.20770: worker is 1 (out of 1 available) 8119 1726773025.20812: exiting _queue_task() for managed_node2/include_tasks 8119 1726773025.20882: done queuing things up, now waiting for results queue to drain 8119 1726773025.20889: waiting for pending results... 8551 1726773025.20943: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8551 1726773025.21002: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000a7 8551 1726773025.21050: calling self._execute() 8551 1726773025.21150: _execute() done 8551 1726773025.21154: dumping result to json 8551 1726773025.21156: done dumping result, returning 8551 1726773025.21160: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [12a3200b-1e9d-1dbd-cc52-0000000000a7] 8551 1726773025.21169: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000a7 8551 1726773025.21222: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000a7 8551 1726773025.21272: WORKER PROCESS EXITING 8119 1726773025.21417: no more pending results, returning what we have 8119 1726773025.21423: in VariableManager get_vars() 8119 1726773025.21459: Calling all_inventory to load vars for managed_node2 8119 1726773025.21464: Calling groups_inventory to load vars for managed_node2 8119 1726773025.21467: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.21518: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.21530: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.21541: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.21549: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.21564: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.21575: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.21588: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.21609: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.21624: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.21837: done with get_vars() 8119 1726773025.21874: we have included files to process 8119 
1726773025.21877: generating all_blocks data 8119 1726773025.21879: done generating all_blocks data 8119 1726773025.21882: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773025.21887: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773025.21891: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773025.22107: plugin lookup for setup failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773025.22203: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773025.22289: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8119 1726773025.22453: done processing included file 8119 1726773025.22456: iterating over new_blocks loaded from include file 8119 1726773025.22458: in VariableManager get_vars() 8119 1726773025.22488: done with get_vars() 8119 1726773025.22492: filtering new block on tags 8119 1726773025.22569: done filtering new block on tags 8119 1726773025.22585: in VariableManager get_vars() 8119 1726773025.22613: done with get_vars() 8119 1726773025.22617: filtering new block on tags 8119 1726773025.22687: done filtering new block on tags 8119 1726773025.22699: in VariableManager get_vars() 8119 1726773025.22725: done with get_vars() 8119 1726773025.22729: filtering new block on tags 8119 1726773025.22802: done filtering new block on tags 8119 1726773025.22814: in VariableManager get_vars() 8119 1726773025.22841: done with get_vars() 8119 1726773025.22845: filtering new block on tags 8119 1726773025.22904: done filtering new block on tags 8119 1726773025.22916: done iterating over new_blocks loaded from include file 8119 1726773025.22919: extending task lists for all hosts with included blocks 8119 1726773025.23052: done extending task lists 8119 1726773025.23057: done processing included files 8119 1726773025.23060: results queue empty 8119 1726773025.23062: checking for any_errors_fatal 8119 1726773025.23066: done checking for any_errors_fatal 8119 1726773025.23069: checking for max_fail_percentage 8119 1726773025.23071: done checking for max_fail_percentage 8119 1726773025.23073: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.23075: done checking to see if all hosts have failed 8119 1726773025.23077: getting the remaining hosts for this loop 8119 1726773025.23080: done getting the remaining hosts for this loop 8119 1726773025.23088: building list of next tasks for hosts 8119 1726773025.23092: getting the next task for host managed_node2 8119 1726773025.23098: done getting next task for host managed_node2 8119 1726773025.23101: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773025.23106: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.23108: done building task lists 8119 1726773025.23110: counting tasks in each state of execution 8119 1726773025.23115: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.23117: advancing hosts in ITERATING_TASKS 8119 1726773025.23119: starting to advance hosts 8119 1726773025.23122: getting the next task for host managed_node2 8119 1726773025.23126: done getting next task for host managed_node2 8119 1726773025.23130: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773025.23133: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.23135: done advancing hosts to next task 8119 1726773025.23143: getting variables 8119 1726773025.23146: in VariableManager get_vars() 8119 1726773025.23163: Calling all_inventory to load vars for managed_node2 8119 1726773025.23168: Calling groups_inventory to load vars for managed_node2 8119 1726773025.23172: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.23213: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.23226: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.23244: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.23258: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.23276: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.23290: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.23308: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.23338: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.23362: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.23647: done with get_vars() 8119 1726773025.23659: done getting variables 8119 1726773025.23665: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.23667: done copying, going to template now 8119 1726773025.23669: done templating 8119 1726773025.23671: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.030) 0:00:19.793 **** 8119 1726773025.23694: sending task start callback 8119 1726773025.23698: entering _queue_task() for managed_node2/setup 8119 1726773025.23845: worker is 1 (out of 1 available) 8119 1726773025.23886: exiting _queue_task() for managed_node2/setup 8119 1726773025.23958: done queuing things up, now waiting for results queue to drain 8119 1726773025.23963: waiting for pending results... 
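The task queued here, "Ensure ansible_facts used by role" (set_vars.yml:2), resolves to the setup action; the records that follow show its conditional evaluating to False, so fact gathering is skipped because the facts the role needs are already cached. A rough, hypothetical sketch of that pattern (the variable name and gather_subset value are assumptions, not taken from the role):

- name: Ensure ansible_facts used by role
  setup:
    gather_subset: min
  # __kernel_settings_required_facts stands in for whatever facts the role actually needs;
  # when they are all present already, the condition below is False and the task is skipped
  when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0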
8554 1726773025.24107: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8554 1726773025.24184: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000158 8554 1726773025.24243: calling self._execute() 8554 1726773025.26104: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8554 1726773025.26199: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8554 1726773025.26260: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8554 1726773025.26294: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8554 1726773025.26324: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8554 1726773025.26351: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8554 1726773025.26400: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8554 1726773025.26434: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8554 1726773025.26452: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8554 1726773025.26546: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8554 1726773025.26563: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8554 1726773025.26576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8554 1726773025.27009: when evaluation is False, skipping this task 8554 1726773025.27014: _execute() done 8554 1726773025.27016: dumping result to json 8554 1726773025.27018: done dumping result, returning 8554 1726773025.27022: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [12a3200b-1e9d-1dbd-cc52-000000000158] 8554 1726773025.27030: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000158 8554 1726773025.27099: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000158 8554 1726773025.27104: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773025.27560: no more pending results, returning what we have 8119 1726773025.27565: results queue empty 8119 1726773025.27567: checking for any_errors_fatal 8119 1726773025.27571: done checking for any_errors_fatal 8119 1726773025.27573: checking for max_fail_percentage 8119 1726773025.27576: done checking for max_fail_percentage 8119 1726773025.27578: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.27581: done checking to see if all hosts have failed 8119 1726773025.27585: getting the remaining hosts for this loop 8119 1726773025.27588: done getting the remaining hosts for this loop 8119 1726773025.27597: building list of next tasks for hosts 8119 1726773025.27600: getting the next task for host managed_node2 8119 1726773025.27613: done getting next task for host managed_node2 8119 1726773025.27618: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773025.27623: ^ state is: 
HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.27626: done building task lists 8119 1726773025.27628: counting tasks in each state of execution 8119 1726773025.27632: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.27635: advancing hosts in ITERATING_TASKS 8119 1726773025.27637: starting to advance hosts 8119 1726773025.27639: getting the next task for host managed_node2 8119 1726773025.27646: done getting next task for host managed_node2 8119 1726773025.27649: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773025.27652: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.27655: done advancing hosts to next task 8119 1726773025.27669: getting variables 8119 1726773025.27672: in VariableManager get_vars() 8119 1726773025.27712: Calling all_inventory to load vars for managed_node2 8119 1726773025.27719: Calling groups_inventory to load vars for managed_node2 8119 1726773025.27723: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.27751: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.27766: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.27786: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.27802: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.27821: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.27831: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.27847: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.27878: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.27906: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.28180: done with get_vars() 8119 1726773025.28195: done getting variables 8119 1726773025.28203: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.28205: done copying, going to template now 8119 1726773025.28209: done templating 8119 1726773025.28211: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.045) 0:00:19.838 **** 8119 1726773025.28227: sending task start callback 8119 1726773025.28229: entering _queue_task() for managed_node2/stat 8119 1726773025.28354: worker is 1 (out of 1 available) 8119 1726773025.28395: exiting _queue_task() for managed_node2/stat 8119 1726773025.28468: done queuing things up, now waiting for results queue to drain 8119 1726773025.28473: waiting for pending results... 
8557 1726773025.28520: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8557 1726773025.28575: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000015a 8557 1726773025.28622: calling self._execute() 8557 1726773025.30398: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8557 1726773025.30791: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8557 1726773025.30857: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8557 1726773025.30903: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8557 1726773025.30945: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8557 1726773025.30980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8557 1726773025.31043: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8557 1726773025.31076: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8557 1726773025.31103: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8557 1726773025.31234: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8557 1726773025.31258: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8557 1726773025.31277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8557 1726773025.31643: when evaluation is False, skipping this task 8557 1726773025.31650: _execute() done 8557 1726773025.31652: dumping result to json 8557 1726773025.31655: done dumping result, returning 8557 1726773025.31661: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [12a3200b-1e9d-1dbd-cc52-00000000015a] 8557 1726773025.31673: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015a 8557 1726773025.31864: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015a 8557 1726773025.31868: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773025.32052: no more pending results, returning what we have 8119 1726773025.32060: results queue empty 8119 1726773025.32062: checking for any_errors_fatal 8119 1726773025.32067: done checking for any_errors_fatal 8119 1726773025.32069: checking for max_fail_percentage 8119 1726773025.32072: done checking for max_fail_percentage 8119 1726773025.32075: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.32077: done checking to see if all hosts have failed 8119 1726773025.32078: getting the remaining hosts for this loop 8119 1726773025.32081: done getting the remaining hosts for this loop 8119 1726773025.32090: building list of next tasks for hosts 8119 1726773025.32093: getting the next task for host managed_node2 8119 1726773025.32101: done getting next task for host managed_node2 8119 1726773025.32106: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 1726773025.32113: ^ state is: HOST 
STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.32116: done building task lists 8119 1726773025.32118: counting tasks in each state of execution 8119 1726773025.32122: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.32128: advancing hosts in ITERATING_TASKS 8119 1726773025.32130: starting to advance hosts 8119 1726773025.32133: getting the next task for host managed_node2 8119 1726773025.32138: done getting next task for host managed_node2 8119 1726773025.32141: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 1726773025.32144: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.32146: done advancing hosts to next task 8119 1726773025.32158: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773025.32161: getting variables 8119 1726773025.32164: in VariableManager get_vars() 8119 1726773025.32192: Calling all_inventory to load vars for managed_node2 8119 1726773025.32196: Calling groups_inventory to load vars for managed_node2 8119 1726773025.32198: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.32222: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.32233: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.32246: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.32259: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.32272: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.32279: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.32291: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.32313: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.32329: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.32548: done with get_vars() 8119 1726773025.32559: done getting variables 8119 1726773025.32563: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.32565: done copying, going to template now 8119 1726773025.32567: done templating 8119 1726773025.32568: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.043) 0:00:19.882 **** 8119 1726773025.32587: sending task start callback 8119 1726773025.32591: entering _queue_task() for managed_node2/set_fact 8119 1726773025.32711: worker is 1 (out of 1 available) 8119 1726773025.32751: exiting _queue_task() for managed_node2/set_fact 8119 1726773025.32828: done queuing things up, now waiting for results queue to drain 8119 1726773025.32833: waiting for pending results... 
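Both ostree-related steps above are skipped on managed_node2 because their when conditions evaluate to False. Based only on what the log shows, namely the task names, the file positions set_vars.yml:10 and set_vars.yml:15, and the stat/set_fact actions being queued, the underlying tasks presumably follow the common ostree-detection pattern sketched below. The /run/ostree-booted path, the register name, the fact name and the guard condition are assumptions, not taken from this log.

# Hedged sketch of the ostree detection pattern implied by the log above.
# Only the task names and the stat/set_fact actions are confirmed by the log;
# the path, register, fact name and guard condition are assumptions.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted                              # assumed marker file
  register: __ostree_booted_stat                          # assumed register name
  when: not __kernel_settings_is_ostree is defined        # assumed guard

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed fact name
  when: not __kernel_settings_is_ostree is defined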
8561 1726773025.32885: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8561 1726773025.32960: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000015b 8561 1726773025.33017: calling self._execute() 8561 1726773025.35256: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8561 1726773025.35377: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8561 1726773025.35453: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8561 1726773025.35496: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8561 1726773025.35555: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8561 1726773025.35592: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8561 1726773025.35640: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8561 1726773025.35663: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8561 1726773025.35679: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8561 1726773025.35763: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8561 1726773025.35780: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8561 1726773025.35796: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8561 1726773025.36103: when evaluation is False, skipping this task 8561 1726773025.36109: _execute() done 8561 1726773025.36112: dumping result to json 8561 1726773025.36114: done dumping result, returning 8561 1726773025.36120: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [12a3200b-1e9d-1dbd-cc52-00000000015b] 8561 1726773025.36131: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015b 8561 1726773025.36165: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015b 8561 1726773025.36169: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773025.36362: no more pending results, returning what we have 8119 1726773025.36366: results queue empty 8119 1726773025.36368: checking for any_errors_fatal 8119 1726773025.36372: done checking for any_errors_fatal 8119 1726773025.36374: checking for max_fail_percentage 8119 1726773025.36379: done checking for max_fail_percentage 8119 1726773025.36381: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.36382: done checking to see if all hosts have failed 8119 1726773025.36385: getting the remaining hosts for this loop 8119 1726773025.36388: done getting the remaining hosts for this loop 8119 1726773025.36396: building list of next tasks for hosts 8119 1726773025.36398: getting the next task for host managed_node2 8119 1726773025.36413: done getting next task for host managed_node2 8119 1726773025.36419: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8119 
1726773025.36425: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.36428: done building task lists 8119 1726773025.36430: counting tasks in each state of execution 8119 1726773025.36435: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.36437: advancing hosts in ITERATING_TASKS 8119 1726773025.36439: starting to advance hosts 8119 1726773025.36442: getting the next task for host managed_node2 8119 1726773025.36448: done getting next task for host managed_node2 8119 1726773025.36452: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8119 1726773025.36455: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.36458: done advancing hosts to next task 8119 1726773025.36473: getting variables 8119 1726773025.36477: in VariableManager get_vars() 8119 1726773025.36520: Calling all_inventory to load vars for managed_node2 8119 1726773025.36526: Calling groups_inventory to load vars for managed_node2 8119 1726773025.36529: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.36555: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.36570: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.36587: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.36601: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.36620: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.36630: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.36644: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.36671: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.36695: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.37023: done with get_vars() 8119 1726773025.37036: done getting variables 8119 1726773025.37043: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.37046: done copying, going to template now 8119 1726773025.37049: done templating 8119 1726773025.37051: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.044) 0:00:19.927 **** 8119 1726773025.37074: sending task start callback 8119 1726773025.37077: entering _queue_task() for managed_node2/stat 8119 1726773025.37214: worker is 1 (out of 1 available) 8119 1726773025.37253: exiting _queue_task() for managed_node2/stat 8119 1726773025.37327: done queuing things up, now waiting for results queue to drain 8119 1726773025.37332: waiting for pending results... 
8565 1726773025.37542: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8565 1726773025.37624: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000015d 8565 1726773025.37677: calling self._execute() 8565 1726773025.40084: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8565 1726773025.40184: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8565 1726773025.40254: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8565 1726773025.40299: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8565 1726773025.40346: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8565 1726773025.40389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8565 1726773025.40659: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8565 1726773025.40692: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8565 1726773025.40719: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8565 1726773025.40817: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8565 1726773025.40838: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8565 1726773025.40855: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8565 1726773025.41257: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8565 1726773025.41312: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8565 1726773025.41328: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8565 1726773025.41344: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8565 1726773025.41352: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8565 1726773025.41468: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8565 1726773025.41491: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8565 1726773025.41526: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8565 1726773025.41537: starting attempt loop 8565 1726773025.41540: running the handler 8565 1726773025.41551: _low_level_execute_command(): starting 8565 1726773025.41557: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8565 1726773025.47265: stdout chunk (state=2): >>>/root <<< 8565 1726773025.47461: stderr chunk (state=3): >>><<< 8565 1726773025.47468: stdout chunk (state=3): >>><<< 8565 1726773025.47496: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr= 8565 1726773025.47524: _low_level_execute_command(): starting 8565 1726773025.47533: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459 `" && echo ansible-tmp-1726773025.4751494-8565-98223315437459="` echo /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459 `" ) && sleep 0' 8565 1726773025.50536: stdout chunk (state=2): >>>ansible-tmp-1726773025.4751494-8565-98223315437459=/root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459 <<< 8565 1726773025.50685: stderr chunk (state=3): >>><<< 8565 1726773025.50693: stdout chunk (state=3): >>><<< 8565 1726773025.50718: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773025.4751494-8565-98223315437459=/root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459 , stderr= 8565 1726773025.50839: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 8565 1726773025.50915: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459/AnsiballZ_stat.py 8565 1726773025.51890: Sending initial data 8565 1726773025.51905: Sent initial data (150 bytes) 8565 1726773025.54294: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpi5lkr9ay /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459/AnsiballZ_stat.py <<< 8565 1726773025.55765: stderr chunk (state=3): >>><<< 8565 1726773025.55772: stdout chunk (state=3): >>><<< 8565 1726773025.55803: done transferring module to remote 8565 1726773025.55824: _low_level_execute_command(): starting 8565 1726773025.55832: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459/ /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459/AnsiballZ_stat.py && sleep 0' 8565 1726773025.58897: stderr chunk (state=2): >>><<< 8565 1726773025.58918: stdout chunk (state=2): >>><<< 8565 1726773025.58942: _low_level_execute_command() done: rc=0, stdout=, stderr= 8565 1726773025.58952: _low_level_execute_command(): starting 8565 1726773025.58961: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459/AnsiballZ_stat.py && sleep 0' 8565 1726773025.73776: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8565 1726773025.75654: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8565 1726773025.75663: stdout chunk (state=3): >>><<< 8565 1726773025.75674: stderr chunk (state=3): >>><<< 8565 1726773025.75696: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 
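The module invocation echoed above shows the exact arguments the role passed to stat (path /sbin/transactional-update) and the result (exists: false). Together with the task path set_vars.yml:22 and the follow-up task "Set flag if transactional-update exists" at set_vars.yml:27, which later reports __kernel_settings_is_transactional: false, the pair of tasks is presumably equivalent to the minimal sketch below; the register name, and the use of a register at all, are assumptions.

# Hedged sketch reconstructed from the stat invocation and the set_fact
# result visible in this log; the register name is an assumption.
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat                   # assumed register name

- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"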
8565 1726773025.75730: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8565 1726773025.75747: _low_level_execute_command(): starting 8565 1726773025.75757: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773025.4751494-8565-98223315437459/ > /dev/null 2>&1 && sleep 0' 8565 1726773025.79652: stderr chunk (state=2): >>><<< 8565 1726773025.79667: stdout chunk (state=2): >>><<< 8565 1726773025.79694: _low_level_execute_command() done: rc=0, stdout=, stderr= 8565 1726773025.79704: handler run complete 8565 1726773025.79742: attempt loop complete, returning result 8565 1726773025.79757: _execute() done 8565 1726773025.79760: dumping result to json 8565 1726773025.79764: done dumping result, returning 8565 1726773025.79781: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [12a3200b-1e9d-1dbd-cc52-00000000015d] 8565 1726773025.79799: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015d 8565 1726773025.80891: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015d 8565 1726773025.80898: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 8119 1726773025.81256: no more pending results, returning what we have 8119 1726773025.81263: results queue empty 8119 1726773025.81266: checking for any_errors_fatal 8119 1726773025.81271: done checking for any_errors_fatal 8119 1726773025.81273: checking for max_fail_percentage 8119 1726773025.81276: done checking for max_fail_percentage 8119 1726773025.81279: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.81281: done checking to see if all hosts have failed 8119 1726773025.81285: getting the remaining hosts for this loop 8119 1726773025.81288: done getting the remaining hosts for this loop 8119 1726773025.81297: building list of next tasks for hosts 8119 1726773025.81300: getting the next task for host managed_node2 8119 1726773025.81310: done getting next task for host managed_node2 8119 1726773025.81315: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8119 1726773025.81321: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.81324: done building task lists 8119 1726773025.81326: counting tasks in each state of execution 8119 1726773025.81330: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.81332: advancing hosts in ITERATING_TASKS 8119 1726773025.81334: starting to advance hosts 8119 1726773025.81336: getting the next task for host managed_node2 8119 1726773025.81340: done getting next task for host managed_node2 8119 1726773025.81342: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8119 1726773025.81346: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.81348: done advancing hosts to next task 8119 1726773025.81362: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773025.81366: getting variables 8119 1726773025.81369: in VariableManager get_vars() 8119 1726773025.81406: Calling all_inventory to load vars for managed_node2 8119 1726773025.81413: Calling groups_inventory to load vars for managed_node2 8119 1726773025.81417: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.81445: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.81464: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.81485: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.81503: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.81520: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.81530: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.81546: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.81572: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.81594: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.81934: done 
with get_vars() 8119 1726773025.81948: done getting variables 8119 1726773025.81955: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.81958: done copying, going to template now 8119 1726773025.81961: done templating 8119 1726773025.81963: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.449) 0:00:20.376 **** 8119 1726773025.81991: sending task start callback 8119 1726773025.81995: entering _queue_task() for managed_node2/set_fact 8119 1726773025.82790: worker is 1 (out of 1 available) 8119 1726773025.82830: exiting _queue_task() for managed_node2/set_fact 8119 1726773025.82905: done queuing things up, now waiting for results queue to drain 8119 1726773025.82912: waiting for pending results... 8599 1726773025.83223: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8599 1726773025.83305: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000015e 8599 1726773025.83361: calling self._execute() 8599 1726773025.85534: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8599 1726773025.85615: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8599 1726773025.85669: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8599 1726773025.85701: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8599 1726773025.85729: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8599 1726773025.85761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8599 1726773025.85805: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8599 1726773025.85840: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8599 1726773025.85862: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8599 1726773025.85981: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8599 1726773025.86005: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8599 1726773025.86023: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8599 1726773025.86425: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8599 1726773025.86430: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8599 1726773025.86432: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8599 1726773025.86434: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8599 
1726773025.86436: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8599 1726773025.86439: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8599 1726773025.86442: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8599 1726773025.86445: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8599 1726773025.86447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8599 1726773025.86465: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8599 1726773025.86468: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8599 1726773025.86470: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8599 1726773025.86513: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8599 1726773025.86548: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8599 1726773025.86565: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8599 1726773025.86580: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8599 1726773025.86589: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8599 1726773025.86711: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8599 1726773025.86726: starting attempt loop 8599 1726773025.86729: running the handler 8599 1726773025.86739: handler run complete 8599 1726773025.86743: attempt loop complete, returning result 8599 1726773025.86746: _execute() done 8599 1726773025.86749: dumping result to json 8599 1726773025.86754: done dumping result, returning 8599 1726773025.86760: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [12a3200b-1e9d-1dbd-cc52-00000000015e] 8599 1726773025.86769: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015e 8599 1726773025.86799: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000015e 8599 1726773025.86803: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_is_transactional": false }, "changed": false } 8119 1726773025.86987: no more pending results, returning what we have 8119 1726773025.86992: results queue empty 8119 1726773025.86996: checking for any_errors_fatal 8119 1726773025.87001: done checking for any_errors_fatal 8119 1726773025.87004: checking for max_fail_percentage 8119 1726773025.87006: done checking for 
max_fail_percentage 8119 1726773025.87011: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.87013: done checking to see if all hosts have failed 8119 1726773025.87015: getting the remaining hosts for this loop 8119 1726773025.87017: done getting the remaining hosts for this loop 8119 1726773025.87025: building list of next tasks for hosts 8119 1726773025.87028: getting the next task for host managed_node2 8119 1726773025.87038: done getting next task for host managed_node2 8119 1726773025.87042: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8119 1726773025.87047: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.87049: done building task lists 8119 1726773025.87051: counting tasks in each state of execution 8119 1726773025.87055: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.87057: advancing hosts in ITERATING_TASKS 8119 1726773025.87059: starting to advance hosts 8119 1726773025.87061: getting the next task for host managed_node2 8119 1726773025.87066: done getting next task for host managed_node2 8119 1726773025.87069: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8119 1726773025.87072: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.87074: done advancing hosts to next task 8119 1726773025.87150: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773025.87156: getting variables 8119 1726773025.87160: in VariableManager get_vars() 8119 1726773025.87195: Calling all_inventory to load vars for managed_node2 8119 1726773025.87201: Calling groups_inventory to load vars for managed_node2 8119 1726773025.87205: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.87239: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.87257: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.87275: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.87291: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.87312: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.87328: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.87345: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.87378: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.87403: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.87757: done with get_vars() 8119 1726773025.87772: done getting variables 8119 1726773025.87779: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.87781: done copying, going to template now 8119 1726773025.87784: done templating 8119 1726773025.87787: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.058) 0:00:20.434 **** 8119 1726773025.87811: sending task start callback 8119 1726773025.87815: entering _queue_task() for managed_node2/include_vars 8119 1726773025.87818: Creating lock for include_vars 8119 1726773025.87969: worker is 1 (out of 1 available) 8119 1726773025.88078: exiting _queue_task() for managed_node2/include_vars 8119 1726773025.88152: done queuing things up, now waiting for results queue to drain 8119 1726773025.88157: waiting for pending results... 
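The task just queued, Set platform/version specific variables (set_vars.yml:31), runs the include_vars action; the worker output below loads the first_found lookup and ends up including the role's vars/default.yml, which supplies __kernel_settings_packages and __kernel_settings_services. A hedged sketch of what such a task typically looks like follows; only default.yml and the include_vars/first_found combination are confirmed by this log, while the other candidate file names and the search path are assumptions.

# Hedged sketch of an include_vars task driven by first_found, assuming the
# usual distribution/version candidate list; only default.yml is confirmed
# by the log as the file actually included.
- name: Set platform/version specific variables
  include_vars: "{{ item }}"
  with_first_found:
    - files:
        - "{{ ansible_distribution }}_{{ ansible_distribution_version }}.yml"   # assumed candidate
        - "{{ ansible_distribution }}.yml"                                      # assumed candidate
        - default.yml                                                           # included per the log below
      paths:
        - "{{ role_path }}/vars"                                                # assumed search path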
8603 1726773025.88150: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8603 1726773025.88229: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000160 8603 1726773025.88285: calling self._execute() 8603 1726773025.90719: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8603 1726773025.90833: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8603 1726773025.90999: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8603 1726773025.91035: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8603 1726773025.91069: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8603 1726773025.91107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8603 1726773025.91152: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8603 1726773025.91175: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8603 1726773025.91216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8603 1726773025.91305: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8603 1726773025.91332: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8603 1726773025.91351: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8603 1726773025.92122: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup 8603 1726773025.92288: Loaded config def from plugin (lookup/first_found) 8603 1726773025.92294: Loading LookupModule 'first_found' from /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup/first_found.py 8603 1726773025.92348: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8603 1726773025.92388: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8603 1726773025.92401: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8603 1726773025.92420: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8603 1726773025.92428: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8603 1726773025.92522: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8603 1726773025.92530: starting attempt loop 8603 1726773025.92533: running the handler 8603 1726773025.92571: handler run complete 8603 1726773025.92575: attempt loop complete, returning result 8603 1726773025.92577: _execute() done 8603 1726773025.92579: dumping result to json 8603 1726773025.92582: done dumping result, returning 8603 1726773025.92589: done running TaskExecutor() for 
managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [12a3200b-1e9d-1dbd-cc52-000000000160] 8603 1726773025.92595: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000160 8603 1726773025.92624: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000160 8603 1726773025.92628: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8119 1726773025.92824: no more pending results, returning what we have 8119 1726773025.92829: results queue empty 8119 1726773025.92832: checking for any_errors_fatal 8119 1726773025.92836: done checking for any_errors_fatal 8119 1726773025.92838: checking for max_fail_percentage 8119 1726773025.92841: done checking for max_fail_percentage 8119 1726773025.92843: checking to see if all hosts have failed and the running result is not ok 8119 1726773025.92845: done checking to see if all hosts have failed 8119 1726773025.92847: getting the remaining hosts for this loop 8119 1726773025.92849: done getting the remaining hosts for this loop 8119 1726773025.92856: building list of next tasks for hosts 8119 1726773025.92859: getting the next task for host managed_node2 8119 1726773025.92873: done getting next task for host managed_node2 8119 1726773025.92878: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773025.92882: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773025.92886: done building task lists 8119 1726773025.92888: counting tasks in each state of execution 8119 1726773025.92892: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773025.92895: advancing hosts in ITERATING_TASKS 8119 1726773025.92897: starting to advance hosts 8119 1726773025.92899: getting the next task for host managed_node2 8119 1726773025.92904: done getting next task for host managed_node2 8119 1726773025.92907: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773025.92910: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773025.92912: done advancing hosts to next task 8119 1726773025.92927: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773025.92930: getting variables 8119 1726773025.92932: in VariableManager get_vars() 8119 1726773025.92957: Calling all_inventory to load vars for managed_node2 8119 1726773025.92960: Calling groups_inventory to load vars for managed_node2 8119 1726773025.92963: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773025.92997: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.93016: Calling all_plugins_play to load vars for managed_node2 8119 1726773025.93033: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.93046: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773025.93062: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.93073: Calling groups_plugins_play to load vars for managed_node2 8119 1726773025.93091: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.93122: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.93146: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773025.93472: done with get_vars() 8119 1726773025.93486: done getting variables 8119 1726773025.93494: sending task start callback, copying the task so we can template it temporarily 8119 1726773025.93497: done copying, going to template now 8119 1726773025.93500: done templating 8119 1726773025.93502: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:10:25 -0400 (0:00:00.057) 0:00:20.491 **** 8119 1726773025.93526: sending task start callback 8119 1726773025.93529: entering _queue_task() for managed_node2/package 8119 1726773025.93674: worker is 1 (out of 1 available) 8119 1726773025.93710: exiting _queue_task() for managed_node2/package 8119 1726773025.93780: done queuing things up, now waiting for results queue to drain 8119 1726773025.93823: waiting for pending results... 
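The last task queued in this excerpt, Ensure required packages are installed (tasks/main.yml:12), uses the generic package action module, and the include_vars result above set __kernel_settings_packages to tuned and python3-configobj. The task is presumably close to the sketch below; feeding the fact directly to name and the state value are assumptions.

# Hedged sketch of the package task, assuming it installs the package list
# loaded by the include_vars step above.
- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"   # [tuned, python3-configobj] per the log above
    state: present                             # assumed state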
8609 1726773025.94112: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8609 1726773025.94159: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000a8 8609 1726773025.94204: calling self._execute() 8609 1726773025.96671: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8609 1726773025.96789: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8609 1726773025.96856: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8609 1726773025.96894: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8609 1726773025.96933: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8609 1726773025.96969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8609 1726773025.97025: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8609 1726773025.97057: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8609 1726773025.97082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8609 1726773025.97202: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8609 1726773025.97225: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8609 1726773025.97243: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8609 1726773025.97415: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8609 1726773025.97420: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8609 1726773025.97423: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8609 1726773025.97426: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8609 1726773025.97429: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8609 1726773025.97431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8609 1726773025.97434: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8609 1726773025.97436: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8609 1726773025.97439: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8609 1726773025.97459: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, 
class_only=False) 8609 1726773025.97462: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8609 1726773025.97465: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8609 1726773025.97716: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8609 1726773025.97766: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8609 1726773025.97779: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8609 1726773025.97797: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8609 1726773025.97804: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8609 1726773025.97927: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8609 1726773025.97938: starting attempt loop 8609 1726773025.97941: running the handler 8609 1726773025.98097: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale 8609 1726773025.98111: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity 8609 1726773025.98127: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap 8609 1726773025.98142: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox 8609 1726773025.98154: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios 8609 1726773025.98173: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/__pycache__ 8609 1726773025.98188: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics/__pycache__ 8609 1726773025.98194: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale/__pycache__ 8609 1726773025.98198: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity/__pycache__ 8609 1726773025.98204: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap/__pycache__ 8609 1726773025.98211: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox/__pycache__ 8609 1726773025.98218: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios/__pycache__ 8609 1726773025.98230: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10 8609 1726773025.98241: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci 8609 1726773025.98359: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos 8609 1726773025.98369: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos 8609 1726773025.98390: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba 8609 1726773025.98396: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa 8609 1726773025.98404: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi 8609 1726773025.98471: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch 8609 1726773025.98479: 
trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point 8609 1726773025.98585: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix 8609 1726773025.98592: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli 8609 1726773025.98600: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine 8609 1726773025.98667: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision 8609 1726773025.98674: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos 8609 1726773025.98706: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus 8609 1726773025.98723: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10 8609 1726773025.98731: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6 8609 1726773025.98737: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9 8609 1726773025.98744: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos 8609 1726773025.98750: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch 8609 1726773025.98756: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos 8609 1726773025.98762: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos 8609 1726773025.98803: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli 8609 1726773025.98813: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos 8609 1726773025.98821: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5 8609 1726773025.98996: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files 8609 1726773025.99004: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer 8609 1726773025.99011: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager 8609 1726773025.99046: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios 8609 1726773025.99535: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr 8609 1726773025.99548: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd 8609 1726773025.99557: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx 8609 1726773025.99574: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos 8609 1726773025.99592: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate 8609 1726773025.99599: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface 8609 1726773025.99606: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios 8609 1726773025.99638: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr 8609 1726773025.99659: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware 8609 1726773025.99668: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential 8609 1726773025.99674: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/junos 8609 1726773025.99710: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2 8609 1726773025.99717: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3 8609 1726773025.99723: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki 8609 1726773025.99747: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/netact 8609 1726773025.99752: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf 8609 1726773025.99758: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler 8609 1726773025.99778: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor 8609 1726773025.99833: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos 8609 1726773025.99841: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso 8609 1726773025.99849: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage 8609 1726773025.99856: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos 8609 1726773025.99943: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx 8609 1726773025.99996: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx 8609 1726773026.00006: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance 8609 1726773026.00018: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs 8609 1726773026.00028: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos 8609 1726773026.00061: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol 8609 1726773026.00067: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware 8609 1726773026.00074: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf 8609 1726773026.00079: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros 8609 1726773026.00089: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing 8609 1726773026.00095: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive 8609 1726773026.00102: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos 8609 1726773026.00116: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros 8609 1726773026.00123: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system 8609 1726773026.00131: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss 8609 1726773026.00137: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos 8609 1726773026.00161: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/__pycache__ 8609 1726773026.00168: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10/__pycache__ 8609 1726773026.00175: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci/__pycache__ 8609 1726773026.00278: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos/__pycache__ 8609 1726773026.00290: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos/__pycache__ 8609 1726773026.00311: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba/__pycache__ 8609 1726773026.00320: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa/__pycache__ 8609 1726773026.00329: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi/__pycache__ 8609 1726773026.00392: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch/__pycache__ 8609 1726773026.00401: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point/__pycache__ 8609 1726773026.00501: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix/__pycache__ 8609 1726773026.00511: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli/__pycache__ 8609 1726773026.00519: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine/__pycache__ 8609 1726773026.00580: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision/__pycache__ 8609 1726773026.00590: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos/__pycache__ 8609 1726773026.00624: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus/__pycache__ 8609 1726773026.00640: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10/__pycache__ 8609 1726773026.00649: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6/__pycache__ 8609 1726773026.00658: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9/__pycache__ 8609 1726773026.00667: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos/__pycache__ 8609 1726773026.00675: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch/__pycache__ 8609 1726773026.00681: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos/__pycache__ 8609 1726773026.00692: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos/__pycache__ 8609 1726773026.00726: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli/__pycache__ 8609 1726773026.00733: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos/__pycache__ 8609 1726773026.00742: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5/__pycache__ 8609 1726773026.00910: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files/__pycache__ 8609 1726773026.00919: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer/__pycache__ 8609 1726773026.00926: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager/__pycache__ 8609 1726773026.00956: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios/__pycache__ 8609 1726773026.01313: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr/__pycache__ 8609 1726773026.01321: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd/__pycache__ 8609 1726773026.01327: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx/__pycache__ 8609 1726773026.01338: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos/__pycache__ 8609 1726773026.01348: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate/__pycache__ 8609 1726773026.01352: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface/__pycache__ 8609 1726773026.01357: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios/__pycache__ 8609 1726773026.01378: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr/__pycache__ 8609 1726773026.01394: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware/__pycache__ 8609 1726773026.01403: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential/__pycache__ 8609 1726773026.01412: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/junos/__pycache__ 8609 1726773026.01437: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2/__pycache__ 8609 1726773026.01443: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3/__pycache__ 8609 1726773026.01447: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki/__pycache__ 8609 1726773026.01461: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact/__pycache__ 8609 1726773026.01465: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf/__pycache__ 8609 1726773026.01470: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler/__pycache__ 8609 1726773026.01481: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor/__pycache__ 8609 1726773026.01522: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos/__pycache__ 8609 1726773026.01531: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso/__pycache__ 8609 1726773026.01540: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage/__pycache__ 8609 1726773026.01545: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos/__pycache__ 8609 1726773026.01596: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx/__pycache__ 8609 1726773026.01617: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx/__pycache__ 8609 1726773026.01622: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance/__pycache__ 8609 1726773026.01628: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs/__pycache__ 8609 1726773026.01636: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos/__pycache__ 8609 1726773026.01658: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol/__pycache__ 8609 1726773026.01663: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware/__pycache__ 8609 1726773026.01668: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf/__pycache__ 8609 1726773026.01672: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros/__pycache__ 8609 1726773026.01677: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing/__pycache__ 8609 1726773026.01680: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive/__pycache__ 8609 1726773026.01687: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos/__pycache__ 8609 1726773026.01696: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros/__pycache__ 8609 1726773026.01701: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system/__pycache__ 8609 1726773026.01709: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss/__pycache__ 8609 1726773026.01714: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos/__pycache__ 8609 1726773026.01729: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification/__pycache__ 8609 1726773026.01755: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/language 8609 1726773026.01774: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/os 8609 1726773026.01832: _low_level_execute_command(): starting 8609 1726773026.01837: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8609 1726773026.04350: stdout chunk (state=2): >>>/root <<< 8609 1726773026.04469: stderr chunk 
(state=3): >>><<< 8609 1726773026.04474: stdout chunk (state=3): >>><<< 8609 1726773026.04499: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8609 1726773026.04517: _low_level_execute_command(): starting 8609 1726773026.04523: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405 `" && echo ansible-tmp-1726773026.0451095-8609-70847582807405="` echo /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405 `" ) && sleep 0' 8609 1726773026.07175: stdout chunk (state=2): >>>ansible-tmp-1726773026.0451095-8609-70847582807405=/root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405 <<< 8609 1726773026.07310: stderr chunk (state=3): >>><<< 8609 1726773026.07315: stdout chunk (state=3): >>><<< 8609 1726773026.07333: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773026.0451095-8609-70847582807405=/root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405 , stderr= 8609 1726773026.07452: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/dnf-ZIP_DEFLATED 8609 1726773026.07525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405/AnsiballZ_dnf.py 8609 1726773026.07835: Sending initial data 8609 1726773026.07853: Sent initial data (149 bytes) 8609 1726773026.10299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpqf1d33m4 /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405/AnsiballZ_dnf.py <<< 8609 1726773026.12160: stderr chunk (state=3): >>><<< 8609 1726773026.12170: stdout chunk (state=3): >>><<< 8609 1726773026.12195: done transferring module to remote 8609 1726773026.12211: _low_level_execute_command(): starting 8609 1726773026.12216: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405/ /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405/AnsiballZ_dnf.py && sleep 0' 8609 1726773026.14971: stderr chunk (state=2): >>><<< 8609 1726773026.14989: stdout chunk (state=2): >>><<< 8609 1726773026.15015: _low_level_execute_command() done: rc=0, stdout=, stderr= 8609 1726773026.15023: _low_level_execute_command(): starting 8609 1726773026.15031: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405/AnsiballZ_dnf.py && sleep 0' 8609 1726773028.72223: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8609 1726773028.74585: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 8609 1726773028.74595: stdout chunk (state=3): >>><<< 8609 1726773028.74607: stderr chunk (state=3): >>><<< 8609 1726773028.74631: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.150 closed. 8609 1726773028.74679: done with _execute_module (dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8609 1726773028.74697: _low_level_execute_command(): starting 8609 1726773028.74705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773026.0451095-8609-70847582807405/ > /dev/null 2>&1 && sleep 0' 8609 1726773028.78133: stderr chunk (state=2): >>><<< 8609 1726773028.78148: stdout chunk (state=2): >>><<< 8609 1726773028.78166: _low_level_execute_command() done: rc=0, stdout=, stderr= 8609 1726773028.78174: handler run complete 8609 1726773028.78229: attempt loop complete, returning result 8609 1726773028.78246: _execute() done 8609 1726773028.78251: dumping result to json 8609 1726773028.78257: done dumping result, returning 8609 1726773028.78276: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [12a3200b-1e9d-1dbd-cc52-0000000000a8] 8609 1726773028.78296: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000a8 8609 1726773028.78346: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000a8 8609 1726773028.78350: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8119 1726773028.78922: no more pending results, returning what we have 8119 1726773028.78928: results queue empty 8119 1726773028.78931: checking for any_errors_fatal 8119 1726773028.78936: done checking for any_errors_fatal 8119 1726773028.78939: checking for max_fail_percentage 8119 1726773028.78942: done checking for max_fail_percentage 8119 1726773028.78944: checking to see if all hosts have failed and the running result is not ok 8119 1726773028.78947: done checking to see if all hosts have failed 8119 1726773028.78949: getting the remaining hosts for this loop 8119 1726773028.78952: done getting the remaining hosts for this loop 8119 1726773028.78960: building list of next tasks for hosts 8119 1726773028.78963: getting the 
next task for host managed_node2 8119 1726773028.78974: done getting next task for host managed_node2 8119 1726773028.78979: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773028.78985: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773028.78989: done building task lists 8119 1726773028.78991: counting tasks in each state of execution 8119 1726773028.78995: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773028.78998: advancing hosts in ITERATING_TASKS 8119 1726773028.79000: starting to advance hosts 8119 1726773028.79003: getting the next task for host managed_node2 8119 1726773028.79009: done getting next task for host managed_node2 8119 1726773028.79012: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773028.79016: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773028.79018: done advancing hosts to next task 8119 1726773028.79077: Loading ActionModule 'debug' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773028.79085: getting variables 8119 1726773028.79089: in VariableManager get_vars() 8119 1726773028.79125: Calling all_inventory to load vars for managed_node2 8119 1726773028.79132: Calling groups_inventory to load vars for managed_node2 8119 1726773028.79136: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773028.79168: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.79185: Calling all_plugins_play to load vars for managed_node2 8119 1726773028.79206: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.79221: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773028.79240: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.79250: Calling groups_plugins_play to load vars for managed_node2 8119 1726773028.79267: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.79300: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.79327: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.79598: done with get_vars() 8119 1726773028.79608: done getting variables 8119 1726773028.79614: sending task start callback, copying the task so we can template it temporarily 8119 1726773028.79616: done copying, going to template now 8119 1726773028.79618: done templating 8119 1726773028.79620: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:10:28 -0400 (0:00:02.861) 0:00:23.352 **** 8119 1726773028.79639: sending task start callback 8119 1726773028.79641: entering _queue_task() for managed_node2/debug 8119 1726773028.79642: Creating lock for debug 8119 1726773028.79785: worker is 1 (out of 1 available) 8119 1726773028.79824: exiting _queue_task() for managed_node2/debug 8119 1726773028.79897: done queuing things up, now waiting for results queue to drain 8119 1726773028.79902: waiting for pending results... 
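The dnf run above returned "Nothing to do" because both packages were already installed, and the role moves on to its notification task at roles/kernel_settings/tasks/main.yml:24, which the next lines show being skipped because its when: condition evaluates to False. The actual message and condition are not visible in this log; a sketch of the general shape of such a guarded task, with a purely illustrative variable name:

    # Illustrative only: the real when: expression is not shown in this log
    - name: Notify user that reboot is needed to apply changes
      debug:
        msg: kernel settings changes require a reboot to take effect
      when: __kernel_settings_reboot_required | d(false)   # assumed flag name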
8739 1726773028.79967: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8739 1726773028.80035: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000aa 8739 1726773028.80091: calling self._execute() 8739 1726773028.82535: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8739 1726773028.82648: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8739 1726773028.82723: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8739 1726773028.82760: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8739 1726773028.82801: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8739 1726773028.82839: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8739 1726773028.82902: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8739 1726773028.82940: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8739 1726773028.82967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8739 1726773028.83147: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8739 1726773028.83174: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8739 1726773028.83200: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8739 1726773028.83570: when evaluation is False, skipping this task 8739 1726773028.83576: _execute() done 8739 1726773028.83579: dumping result to json 8739 1726773028.83582: done dumping result, returning 8739 1726773028.83592: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000000aa] 8739 1726773028.83605: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000aa 8739 1726773028.83649: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000aa 8739 1726773028.83653: WORKER PROCESS EXITING skipping: [managed_node2] => {} 8119 1726773028.84165: no more pending results, returning what we have 8119 1726773028.84170: results queue empty 8119 1726773028.84172: checking for any_errors_fatal 8119 1726773028.84179: done checking for any_errors_fatal 8119 1726773028.84181: checking for max_fail_percentage 8119 1726773028.84186: done checking for max_fail_percentage 8119 1726773028.84188: checking to see if all hosts have failed and the running result is not ok 8119 1726773028.84191: done checking to see if all hosts have failed 8119 1726773028.84193: getting the remaining hosts for this loop 8119 1726773028.84195: done getting the remaining hosts for this loop 8119 1726773028.84203: building list of next tasks for hosts 8119 1726773028.84205: getting the next task for host managed_node2 8119 1726773028.84215: done getting next task for host managed_node2 8119 1726773028.84220: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773028.84225: ^ state is: HOST STATE: block=2, 
task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773028.84227: done building task lists 8119 1726773028.84229: counting tasks in each state of execution 8119 1726773028.84231: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773028.84233: advancing hosts in ITERATING_TASKS 8119 1726773028.84234: starting to advance hosts 8119 1726773028.84236: getting the next task for host managed_node2 8119 1726773028.84238: done getting next task for host managed_node2 8119 1726773028.84240: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773028.84242: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773028.84243: done advancing hosts to next task 8119 1726773028.84319: Loading ActionModule 'reboot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773028.84325: getting variables 8119 1726773028.84327: in VariableManager get_vars() 8119 1726773028.84356: Calling all_inventory to load vars for managed_node2 8119 1726773028.84360: Calling groups_inventory to load vars for managed_node2 8119 1726773028.84362: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773028.84382: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.84396: Calling all_plugins_play to load vars for managed_node2 8119 1726773028.84406: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.84418: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773028.84429: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.84435: Calling groups_plugins_play to load vars for managed_node2 8119 1726773028.84444: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.84468: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.84485: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.84864: done with get_vars() 8119 1726773028.84877: done getting variables 8119 1726773028.84886: sending task start callback, copying the task so we can template it temporarily 8119 1726773028.84890: done copying, going to template now 8119 1726773028.84893: done templating 8119 1726773028.84895: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:10:28 -0400 (0:00:00.052) 0:00:23.405 **** 8119 1726773028.84922: sending task start callback 8119 1726773028.84926: entering _queue_task() for managed_node2/reboot 8119 1726773028.84929: Creating lock for reboot 8119 1726773028.85090: worker is 1 (out of 1 available) 8119 1726773028.85128: exiting _queue_task() for managed_node2/reboot 8119 1726773028.85196: done queuing things up, now waiting for results queue to drain 8119 1726773028.85201: waiting for pending results... 8745 1726773028.85407: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8745 1726773028.85475: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000ab 8745 1726773028.85542: calling self._execute() 8745 1726773028.88429: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8745 1726773028.88546: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8745 1726773028.88619: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8745 1726773028.88658: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8745 1726773028.88703: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8745 1726773028.88746: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8745 1726773028.88811: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8745 1726773028.88844: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8745 1726773028.88869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8745 1726773028.88981: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8745 1726773028.89028: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8745 1726773028.89050: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8745 1726773028.89419: when evaluation is False, skipping this task 8745 1726773028.89424: _execute() done 8745 1726773028.89427: dumping result to json 8745 1726773028.89430: done dumping result, returning 8745 1726773028.89436: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [12a3200b-1e9d-1dbd-cc52-0000000000ab] 8745 1726773028.89448: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000ab 8745 1726773028.89489: done sending task 
result for task 12a3200b-1e9d-1dbd-cc52-0000000000ab 8745 1726773028.89493: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773028.90280: no more pending results, returning what we have 8119 1726773028.90287: results queue empty 8119 1726773028.90290: checking for any_errors_fatal 8119 1726773028.90295: done checking for any_errors_fatal 8119 1726773028.90297: checking for max_fail_percentage 8119 1726773028.90300: done checking for max_fail_percentage 8119 1726773028.90302: checking to see if all hosts have failed and the running result is not ok 8119 1726773028.90304: done checking to see if all hosts have failed 8119 1726773028.90306: getting the remaining hosts for this loop 8119 1726773028.90311: done getting the remaining hosts for this loop 8119 1726773028.90320: building list of next tasks for hosts 8119 1726773028.90323: getting the next task for host managed_node2 8119 1726773028.90331: done getting next task for host managed_node2 8119 1726773028.90337: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773028.90341: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773028.90344: done building task lists 8119 1726773028.90346: counting tasks in each state of execution 8119 1726773028.90350: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773028.90353: advancing hosts in ITERATING_TASKS 8119 1726773028.90355: starting to advance hosts 8119 1726773028.90357: getting the next task for host managed_node2 8119 1726773028.90361: done getting next task for host managed_node2 8119 1726773028.90364: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773028.90368: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773028.90370: done advancing hosts to next task 8119 1726773028.90388: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773028.90393: getting variables 8119 1726773028.90397: in VariableManager get_vars() 8119 1726773028.90435: Calling all_inventory to load vars for managed_node2 8119 1726773028.90442: Calling groups_inventory to load vars for managed_node2 8119 1726773028.90446: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773028.90475: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.90495: Calling all_plugins_play to load vars for managed_node2 8119 1726773028.90514: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.90528: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773028.90542: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.90551: Calling groups_plugins_play to load vars for managed_node2 8119 1726773028.90564: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.90592: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.90619: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.90943: done with get_vars() 8119 1726773028.90956: done getting variables 8119 1726773028.90963: sending task start callback, copying the task so we can template it temporarily 8119 1726773028.90965: done copying, going to template now 8119 1726773028.90968: done templating 8119 1726773028.90970: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:10:28 -0400 (0:00:00.060) 0:00:23.466 **** 8119 1726773028.90995: sending task start callback 8119 1726773028.90998: entering _queue_task() for managed_node2/fail 8119 1726773028.91144: worker is 1 (out of 1 available) 8119 1726773028.91180: exiting _queue_task() for managed_node2/fail 8119 1726773028.91250: done queuing things up, now waiting for results queue to drain 8119 1726773028.91254: waiting for pending results... 
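The reboot task at roles/kernel_settings/tasks/main.yml:29 was just skipped with "Conditional result was False", and the "Fail if reboot is needed and not set" task (main.yml:34) queued here is skipped the same way below. The usual pattern behind such a pair is that the role reboots automatically only on transactional-update systems where the caller has opted in, and otherwise fails so a pending reboot is not silently ignored. A hedged sketch of that pattern; the condition variables are assumptions, not taken from this log:

    # Sketch of the guarded reboot/fail pair; variable names are assumptions
    - name: Reboot transactional update systems
      reboot:
      when:
        - __kernel_settings_is_transactional | d(false)   # assumed flag
        - kernel_settings_reboot_ok | d(false)            # assumed opt-in

    - name: Fail if reboot is needed and not set
      fail:
        msg: A reboot is required to apply the changes but has not been permitted
      when:
        - __kernel_settings_is_transactional | d(false)   # assumed flag
        - not (kernel_settings_reboot_ok | d(false))      # assumed opt-in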
8750 1726773028.91701: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8750 1726773028.91773: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000ac 8750 1726773028.91829: calling self._execute() 8750 1726773028.94282: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8750 1726773028.94401: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8750 1726773028.94490: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8750 1726773028.94534: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8750 1726773028.94575: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8750 1726773028.94619: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8750 1726773028.94678: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8750 1726773028.94716: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8750 1726773028.94743: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8750 1726773028.94857: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8750 1726773028.94881: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8750 1726773028.94943: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8750 1726773028.95285: when evaluation is False, skipping this task 8750 1726773028.95291: _execute() done 8750 1726773028.95294: dumping result to json 8750 1726773028.95297: done dumping result, returning 8750 1726773028.95303: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [12a3200b-1e9d-1dbd-cc52-0000000000ac] 8750 1726773028.95317: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000ac 8750 1726773028.95374: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000ac 8750 1726773028.95378: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773028.95755: no more pending results, returning what we have 8119 1726773028.95761: results queue empty 8119 1726773028.95763: checking for any_errors_fatal 8119 1726773028.95768: done checking for any_errors_fatal 8119 1726773028.95772: checking for max_fail_percentage 8119 1726773028.95775: done checking for max_fail_percentage 8119 1726773028.95777: checking to see if all hosts have failed and the running result is not ok 8119 1726773028.95780: done checking to see if all hosts have failed 8119 1726773028.95782: getting the remaining hosts for this loop 8119 1726773028.95787: done getting the remaining hosts for this loop 8119 1726773028.95795: building list of next tasks for hosts 8119 1726773028.95799: getting the next task for host managed_node2 8119 1726773028.95812: done getting next task for host managed_node2 8119 1726773028.95817: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 1726773028.95822: ^ state 
is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773028.95825: done building task lists 8119 1726773028.95827: counting tasks in each state of execution 8119 1726773028.95831: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773028.95834: advancing hosts in ITERATING_TASKS 8119 1726773028.95836: starting to advance hosts 8119 1726773028.95838: getting the next task for host managed_node2 8119 1726773028.95844: done getting next task for host managed_node2 8119 1726773028.95847: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 1726773028.95851: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773028.95853: done advancing hosts to next task 8119 1726773028.95898: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773028.95905: getting variables 8119 1726773028.95911: in VariableManager get_vars() 8119 1726773028.95946: Calling all_inventory to load vars for managed_node2 8119 1726773028.95953: Calling groups_inventory to load vars for managed_node2 8119 1726773028.95957: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773028.95988: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.96005: Calling all_plugins_play to load vars for managed_node2 8119 1726773028.96028: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.96043: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773028.96062: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.96074: Calling groups_plugins_play to load vars for managed_node2 8119 1726773028.96094: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.96129: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773028.96154: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 
1726773028.96484: done with get_vars() 8119 1726773028.96500: done getting variables 8119 1726773028.96507: sending task start callback, copying the task so we can template it temporarily 8119 1726773028.96512: done copying, going to template now 8119 1726773028.96515: done templating 8119 1726773028.96517: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:10:28 -0400 (0:00:00.055) 0:00:23.521 **** 8119 1726773028.96542: sending task start callback 8119 1726773028.96545: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773028.96548: Creating lock for fedora.linux_system_roles.kernel_settings_get_config 8119 1726773028.96734: worker is 1 (out of 1 available) 8119 1726773028.96770: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773028.96844: done queuing things up, now waiting for results queue to drain 8119 1726773028.96850: waiting for pending results... 8754 1726773028.97098: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8754 1726773028.97171: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000ae 8754 1726773028.97227: calling self._execute() 8754 1726773028.99420: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8754 1726773028.99504: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8754 1726773028.99563: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8754 1726773028.99592: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8754 1726773028.99625: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8754 1726773028.99657: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8754 1726773028.99702: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8754 1726773028.99730: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8754 1726773028.99748: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8754 1726773028.99845: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8754 1726773028.99862: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8754 1726773028.99880: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8754 1726773029.00127: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8754 1726773029.00164: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8754 1726773029.00174: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8754 1726773029.00186: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8754 1726773029.00193: 
Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8754 1726773029.00277: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8754 1726773029.00291: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8754 1726773029.00317: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8754 1726773029.00331: starting attempt loop 8754 1726773029.00335: running the handler 8754 1726773029.00343: _low_level_execute_command(): starting 8754 1726773029.00347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8754 1726773029.02933: stdout chunk (state=2): >>>/root <<< 8754 1726773029.03036: stderr chunk (state=3): >>><<< 8754 1726773029.03041: stdout chunk (state=3): >>><<< 8754 1726773029.03060: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8754 1726773029.03078: _low_level_execute_command(): starting 8754 1726773029.03088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962 `" && echo ansible-tmp-1726773029.030704-8754-260935170948962="` echo /root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962 `" ) && sleep 0' 8754 1726773029.05747: stdout chunk (state=2): >>>ansible-tmp-1726773029.030704-8754-260935170948962=/root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962 <<< 8754 1726773029.05869: stderr chunk (state=3): >>><<< 8754 1726773029.05875: stdout chunk (state=3): >>><<< 8754 1726773029.05894: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773029.030704-8754-260935170948962=/root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962 , stderr= 8754 1726773029.05982: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config 8754 1726773029.05987: ANSIBALLZ: Acquiring lock 8754 1726773029.05991: ANSIBALLZ: Lock acquired: 140408671117712 8754 1726773029.05993: ANSIBALLZ: Creating module 8754 1726773029.17264: ANSIBALLZ: Writing module into payload 8754 1726773029.17341: ANSIBALLZ: Writing module 8754 1726773029.17367: ANSIBALLZ: Renaming module 8754 1726773029.17373: ANSIBALLZ: Done creating module 8754 1726773029.17405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962/AnsiballZ_kernel_settings_get_config.py 8754 1726773029.18121: Sending initial data 8754 1726773029.18132: Sent initial data (172 bytes) 8754 1726773029.20707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpigo88ghd /root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962/AnsiballZ_kernel_settings_get_config.py <<< 8754 1726773029.21887: stderr chunk (state=3): >>><<< 8754 1726773029.21895: stdout chunk (state=3): >>><<< 8754 1726773029.21925: done transferring module to remote 8754 1726773029.21945: _low_level_execute_command(): starting 8754 1726773029.21952: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962/ /root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8754 1726773029.24634: stderr chunk (state=2): >>><<< 8754 1726773029.24651: stdout chunk (state=2): >>><<< 8754 1726773029.24676: _low_level_execute_command() done: rc=0, stdout=, stderr= 8754 1726773029.24684: _low_level_execute_command(): starting 8754 1726773029.24696: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8754 1726773029.40218: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 8754 1726773029.41190: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8754 1726773029.41240: stderr chunk (state=3): >>><<< 8754 1726773029.41248: stdout chunk (state=3): >>><<< 8754 1726773029.41269: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 8754 1726773029.41298: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8754 1726773029.41311: _low_level_execute_command(): starting 8754 1726773029.41316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773029.030704-8754-260935170948962/ > /dev/null 2>&1 && sleep 0' 8754 1726773029.44198: stderr chunk (state=2): >>><<< 8754 1726773029.44209: stdout chunk (state=2): >>><<< 8754 1726773029.44231: _low_level_execute_command() done: rc=0, stdout=, stderr= 8754 1726773029.44240: handler run complete 8754 1726773029.44278: attempt loop complete, returning result 8754 1726773029.44293: _execute() done 8754 1726773029.44296: dumping result to json 8754 1726773029.44299: done dumping result, returning 8754 1726773029.44313: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [12a3200b-1e9d-1dbd-cc52-0000000000ae] 8754 1726773029.44327: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000ae 8754 1726773029.44363: done sending task result for task 
12a3200b-1e9d-1dbd-cc52-0000000000ae 8754 1726773029.44369: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8119 1726773029.44576: no more pending results, returning what we have 8119 1726773029.44581: results queue empty 8119 1726773029.44585: checking for any_errors_fatal 8119 1726773029.44589: done checking for any_errors_fatal 8119 1726773029.44592: checking for max_fail_percentage 8119 1726773029.44595: done checking for max_fail_percentage 8119 1726773029.44597: checking to see if all hosts have failed and the running result is not ok 8119 1726773029.44599: done checking to see if all hosts have failed 8119 1726773029.44601: getting the remaining hosts for this loop 8119 1726773029.44604: done getting the remaining hosts for this loop 8119 1726773029.44611: building list of next tasks for hosts 8119 1726773029.44614: getting the next task for host managed_node2 8119 1726773029.44621: done getting next task for host managed_node2 8119 1726773029.44625: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773029.44629: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773029.44631: done building task lists 8119 1726773029.44633: counting tasks in each state of execution 8119 1726773029.44637: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773029.44639: advancing hosts in ITERATING_TASKS 8119 1726773029.44641: starting to advance hosts 8119 1726773029.44643: getting the next task for host managed_node2 8119 1726773029.44647: done getting next task for host managed_node2 8119 1726773029.44650: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773029.44653: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773029.44655: done advancing hosts to next task 8119 1726773029.44669: getting variables 8119 1726773029.44672: in VariableManager get_vars() 8119 1726773029.44709: Calling all_inventory to load vars for managed_node2 8119 1726773029.44716: Calling groups_inventory to load vars for managed_node2 8119 1726773029.44720: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773029.44746: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773029.44758: Calling all_plugins_play to load vars for managed_node2 8119 1726773029.44768: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773029.44776: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773029.44793: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773029.44804: Calling groups_plugins_play to load vars for managed_node2 8119 1726773029.44816: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773029.44836: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773029.44851: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773029.45086: done with get_vars() 8119 1726773029.45096: done getting variables 8119 1726773029.45101: sending task start callback, copying the task so we can template it temporarily 8119 1726773029.45103: done copying, going to template now 8119 1726773029.45105: done templating 8119 1726773029.45106: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:10:29 -0400 (0:00:00.485) 0:00:24.007 **** 8119 1726773029.45135: sending task start callback 8119 1726773029.45138: entering _queue_task() for managed_node2/stat 8119 1726773029.45267: worker is 1 (out of 1 available) 8119 1726773029.45305: exiting _queue_task() for managed_node2/stat 8119 1726773029.45376: done queuing things up, now waiting for results queue to drain 8119 1726773029.45380: waiting for pending results... 
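Editor's note: the records above complete the "Read tuned main config" task. There is no custom action plugin for kernel_settings_get_config (hence the "plugin lookup ... failed" messages and the fallback to the generic 'normal' action), a remote temp directory is created, ANSIBALLZ packages the module, SFTP copies it across, /usr/libexec/platform-python runs it, and the task returns the parsed contents of /etc/tuned/tuned-main.conf. The sketch below is only an illustration of that last step, not the collection's actual module code; the function name and the dummy-section shim for a headerless key = value file are assumptions.

import configparser

def read_tuned_main_config(path="/etc/tuned/tuned-main.conf"):
    """Return tuned-main.conf as a flat dict, like the 'data' field in the log."""
    parser = configparser.ConfigParser()
    with open(path) as conf:
        # Assumption: the file is INI-style key = value pairs without a [section]
        # header, so parse it under a synthetic section and flatten it afterwards.
        parser.read_string("[main]\n" + conf.read())
    return dict(parser["main"])

if __name__ == "__main__":
    # On the managed node in this run the returned data included daemon=1,
    # dynamic_tuning=0, sleep_interval=1, update_interval=10, and so on.
    print(read_tuned_main_config())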
8783 1726773029.45630: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8783 1726773029.45700: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000af 8783 1726773029.47947: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8783 1726773029.48036: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8783 1726773029.48087: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8783 1726773029.48123: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8783 1726773029.48164: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8783 1726773029.48196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8783 1726773029.48239: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8783 1726773029.48285: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8783 1726773029.48305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8783 1726773029.48387: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8783 1726773029.48409: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8783 1726773029.48425: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8783 1726773029.49062: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8783 1726773029.49068: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8783 1726773029.49072: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8783 1726773029.49076: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8783 1726773029.49079: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8783 1726773029.49082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.49087: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8783 1726773029.49090: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8783 1726773029.49093: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8783 1726773029.49122: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8783 1726773029.49126: Loading 
TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8783 1726773029.49129: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.49514: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8783 1726773029.49526: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8783 1726773029.49530: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8783 1726773029.49534: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8783 1726773029.49537: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8783 1726773029.49540: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.49543: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8783 1726773029.49549: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8783 1726773029.49551: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8783 1726773029.49571: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8783 1726773029.49574: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8783 1726773029.49576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.49781: when evaluation is False, skipping this task 8783 1726773029.49833: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8783 1726773029.49838: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8783 1726773029.49992: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8783 1726773029.49996: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8783 1726773029.49999: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8783 1726773029.50002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.50005: Loading FilterModule 'network' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8783 1726773029.50008: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8783 1726773029.50011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8783 1726773029.50040: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8783 1726773029.50044: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8783 1726773029.50048: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.50472: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8783 1726773029.50478: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8783 1726773029.50481: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8783 1726773029.50487: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8783 1726773029.50492: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8783 1726773029.50495: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.50499: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8783 1726773029.50502: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8783 1726773029.50505: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8783 1726773029.50537: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8783 1726773029.50542: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8783 1726773029.50545: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.50908: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8783 1726773029.50965: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8783 1726773029.50981: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8783 1726773029.51001: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8783 1726773029.51013: Loading 
ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "item": "", "skip_reason": "Conditional result was False" } 8783 1726773029.52006: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8783 1726773029.52027: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8783 1726773029.52063: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8783 1726773029.52077: starting attempt loop 8783 1726773029.52080: running the handler 8783 1726773029.52092: _low_level_execute_command(): starting 8783 1726773029.52097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8783 1726773029.56720: stdout chunk (state=2): >>>/root <<< 8783 1726773029.56833: stderr chunk (state=3): >>><<< 8783 1726773029.56839: stdout chunk (state=3): >>><<< 8783 1726773029.56861: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8783 1726773029.56883: _low_level_execute_command(): starting 8783 1726773029.56894: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019 `" && echo ansible-tmp-1726773029.5687392-8783-156922219061019="` echo /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019 `" ) && sleep 0' 8783 1726773029.59522: stdout chunk (state=2): >>>ansible-tmp-1726773029.5687392-8783-156922219061019=/root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019 <<< 8783 1726773029.59644: stderr chunk (state=3): >>><<< 8783 1726773029.59652: stdout chunk (state=3): >>><<< 8783 1726773029.59671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773029.5687392-8783-156922219061019=/root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019 , stderr= 8783 1726773029.59763: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 8783 1726773029.59825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019/AnsiballZ_stat.py 8783 1726773029.60152: Sending initial data 8783 1726773029.60168: Sent initial data (151 bytes) 8783 1726773029.62647: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp4y_pmqta /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019/AnsiballZ_stat.py <<< 8783 1726773029.63625: stderr chunk (state=3): >>><<< 8783 1726773029.63632: stdout chunk (state=3): >>><<< 8783 1726773029.63658: done transferring module to remote 8783 1726773029.63672: _low_level_execute_command(): starting 8783 1726773029.63677: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019/ /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019/AnsiballZ_stat.py && sleep 0' 8783 1726773029.66291: stderr chunk (state=2): >>><<< 8783 1726773029.66303: stdout chunk (state=2): >>><<< 8783 1726773029.66324: _low_level_execute_command() done: rc=0, stdout=, stderr= 8783 1726773029.66328: 
_low_level_execute_command(): starting 8783 1726773029.66335: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019/AnsiballZ_stat.py && sleep 0' 8783 1726773029.80789: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8783 1726773029.81744: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8783 1726773029.81797: stderr chunk (state=3): >>><<< 8783 1726773029.81803: stdout chunk (state=3): >>><<< 8783 1726773029.81824: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 8783 1726773029.81849: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8783 1726773029.81860: _low_level_execute_command(): starting 8783 1726773029.81864: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773029.5687392-8783-156922219061019/ > /dev/null 2>&1 && sleep 0' 8783 1726773029.84861: stderr chunk (state=2): >>><<< 8783 1726773029.84874: stdout chunk (state=2): >>><<< 8783 1726773029.84897: _low_level_execute_command() done: rc=0, stdout=, stderr= 8783 1726773029.84905: handler run complete 8783 1726773029.84929: attempt loop complete, returning result 8783 1726773029.85224: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8783 1726773029.85231: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8783 1726773029.85235: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8783 1726773029.85239: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8783 1726773029.85242: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8783 1726773029.85245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.85250: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8783 
1726773029.85259: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8783 1726773029.85262: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8783 1726773029.85292: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8783 1726773029.85296: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8783 1726773029.85299: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8783 1726773029.85618: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8783 1726773029.85627: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8783 1726773029.85631: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8783 1726773029.85713: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8783 1726773029.85735: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8783 1726773029.85742: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8783 1726773029.85748: starting attempt loop 8783 1726773029.85749: running the handler 8783 1726773029.85754: _low_level_execute_command(): starting 8783 1726773029.85758: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 8783 1726773029.88770: stdout chunk (state=2): >>>/root <<< 8783 1726773029.88786: stderr chunk (state=2): >>><<< 8783 1726773029.88799: stdout chunk (state=3): >>><<< 8783 1726773029.88821: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8783 1726773029.88838: _low_level_execute_command(): starting 8783 1726773029.88846: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349 `" && echo ansible-tmp-1726773029.8883135-8783-147884258865349="` echo /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349 `" ) && sleep 0' 8783 1726773029.91688: stdout chunk (state=2): >>>ansible-tmp-1726773029.8883135-8783-147884258865349=/root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349 <<< 8783 1726773029.91822: stderr chunk (state=3): >>><<< 8783 1726773029.91831: stdout chunk (state=3): >>><<< 8783 1726773029.91855: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773029.8883135-8783-147884258865349=/root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349 , stderr= 8783 1726773029.91964: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 8783 1726773029.92027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349/AnsiballZ_stat.py 8783 1726773029.92764: Sending initial data 8783 1726773029.92780: Sent initial data (151 bytes) 8783 1726773029.95155: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmplwo7vmtx /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349/AnsiballZ_stat.py <<< 8783 1726773029.96297: stderr chunk (state=3): >>><<< 8783 1726773029.96302: stdout chunk (state=3): >>><<< 8783 1726773029.96327: done transferring module to remote 8783 1726773029.96340: _low_level_execute_command(): starting 8783 1726773029.96344: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349/ /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349/AnsiballZ_stat.py && sleep 0' 8783 1726773029.99091: stderr chunk (state=2): >>><<< 8783 1726773029.99104: stdout chunk (state=2): >>><<< 8783 1726773029.99123: _low_level_execute_command() done: rc=0, stdout=, stderr= 8783 1726773029.99127: _low_level_execute_command(): starting 8783 1726773029.99134: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349/AnsiballZ_stat.py && sleep 0' 8783 1726773030.19680: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773024.9403431, "mtime": 1726773022.9623468, "ctime": 1726773022.9623468, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8783 1726773030.20855: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 8783 1726773030.20865: stdout chunk (state=3): >>><<< 8783 1726773030.20875: stderr chunk (state=3): >>><<< 8783 1726773030.20892: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773024.9403431, "mtime": 1726773022.9623468, "ctime": 1726773022.9623468, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 8783 1726773030.20965: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8783 1726773030.20978: _low_level_execute_command(): starting 8783 1726773030.20986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773029.8883135-8783-147884258865349/ > /dev/null 2>&1 && sleep 0' 8783 1726773030.23614: stderr chunk (state=2): >>><<< 8783 1726773030.23627: stdout chunk (state=2): >>><<< 8783 1726773030.23656: _low_level_execute_command() done: rc=0, stdout=, stderr= 8783 1726773030.23666: handler run complete 8783 1726773030.23725: attempt loop complete, returning result 8783 1726773030.23908: dumping result to json 8783 1726773030.23924: done dumping result, returning 8783 1726773030.23939: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [12a3200b-1e9d-1dbd-cc52-0000000000af] 8783 1726773030.23948: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000af 8783 1726773030.23952: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000af 8783 1726773030.23955: WORKER PROCESS EXITING ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773024.9403431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773022.9623468, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726773022.9623468, "nlink": 
4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8119 1726773030.24271: no more pending results, returning what we have 8119 1726773030.24278: results queue empty 8119 1726773030.24280: checking for any_errors_fatal 8119 1726773030.24287: done checking for any_errors_fatal 8119 1726773030.24290: checking for max_fail_percentage 8119 1726773030.24293: done checking for max_fail_percentage 8119 1726773030.24295: checking to see if all hosts have failed and the running result is not ok 8119 1726773030.24297: done checking to see if all hosts have failed 8119 1726773030.24299: getting the remaining hosts for this loop 8119 1726773030.24302: done getting the remaining hosts for this loop 8119 1726773030.24308: building list of next tasks for hosts 8119 1726773030.24311: getting the next task for host managed_node2 8119 1726773030.24320: done getting next task for host managed_node2 8119 1726773030.24330: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773030.24335: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773030.24337: done building task lists 8119 1726773030.24339: counting tasks in each state of execution 8119 1726773030.24344: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773030.24346: advancing hosts in ITERATING_TASKS 8119 1726773030.24349: starting to advance hosts 8119 1726773030.24351: getting the next task for host managed_node2 8119 1726773030.24354: done getting next task for host managed_node2 8119 1726773030.24357: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773030.24360: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773030.24362: done advancing hosts to next task 8119 1726773030.24380: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773030.24388: getting variables 8119 1726773030.24392: in VariableManager get_vars() 8119 1726773030.24434: Calling all_inventory to load vars for managed_node2 8119 1726773030.24442: Calling groups_inventory to load vars for managed_node2 8119 1726773030.24446: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773030.24480: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.24497: Calling all_plugins_play to load vars for managed_node2 8119 1726773030.24513: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.24525: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773030.24540: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.24548: Calling groups_plugins_play to load vars for managed_node2 8119 1726773030.24561: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.24588: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.24610: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.24917: done with get_vars() 8119 1726773030.24931: done getting variables 8119 1726773030.24939: sending task start callback, copying the task so we can template it temporarily 8119 1726773030.24942: done copying, going to template now 8119 1726773030.24946: done templating 8119 1726773030.24948: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:10:30 -0400 (0:00:00.798) 0:00:24.806 **** 8119 1726773030.24969: sending task start callback 8119 1726773030.24972: entering _queue_task() for managed_node2/set_fact 8119 1726773030.25565: worker is 1 (out of 1 available) 8119 1726773030.25603: exiting _queue_task() for managed_node2/set_fact 8843 1726773030.25611: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773030.25672: done queuing things up, now waiting for results queue to drain 8119 1726773030.25677: waiting for pending results... 
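Editor's note: the "Find tuned profile parent directory" task above loops over candidate paths with stat: the empty item is skipped ("Conditional result was False"), /etc/tuned/profiles reports exists: false, and /etc/tuned reports an existing directory. Together with the "Set tuned profile parent dir" set_fact task being queued here, the net effect is "pick the first candidate directory that exists". A rough, hedged Python equivalent is sketched below; the function name and the hard-coded candidate order are illustrative assumptions, while the two paths are the ones actually stat'ed in this run.

from pathlib import Path

# Candidate parent directories probed in this run, in loop order.
CANDIDATES = ["/etc/tuned/profiles", "/etc/tuned"]

def find_profile_parent(candidates=CANDIDATES):
    """Return the first candidate that exists as a directory, else None."""
    for candidate in candidates:
        if Path(candidate).is_dir():
            return candidate
    return None

if __name__ == "__main__":
    # On this host /etc/tuned/profiles is absent, so the result is /etc/tuned,
    # which the role then records as __kernel_settings_profile_parent.
    print(find_profile_parent())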
8843 1726773030.25686: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b0 8843 1726773030.25739: calling self._execute() 8843 1726773030.27923: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8843 1726773030.28035: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8843 1726773030.28104: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8843 1726773030.28142: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8843 1726773030.28178: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8843 1726773030.28215: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8843 1726773030.28271: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8843 1726773030.28324: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8843 1726773030.28350: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8843 1726773030.28454: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8843 1726773030.28476: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8843 1726773030.28502: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8843 1726773030.29057: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8843 1726773030.29100: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8843 1726773030.29115: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8843 1726773030.29129: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8843 1726773030.29135: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8843 1726773030.29233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8843 1726773030.29248: starting attempt loop 8843 1726773030.29250: running the handler 8843 1726773030.29262: handler run complete 8843 1726773030.29265: attempt loop complete, returning result 8843 1726773030.29267: _execute() done 8843 1726773030.29269: dumping result to json 8843 1726773030.29270: done dumping result, returning 8843 1726773030.29275: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [12a3200b-1e9d-1dbd-cc52-0000000000b0] 8843 1726773030.29281: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b0 8843 1726773030.29329: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b0 8843 1726773030.29333: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": 
false } 8119 1726773030.29561: no more pending results, returning what we have 8119 1726773030.29567: results queue empty 8119 1726773030.29569: checking for any_errors_fatal 8119 1726773030.29575: done checking for any_errors_fatal 8119 1726773030.29577: checking for max_fail_percentage 8119 1726773030.29580: done checking for max_fail_percentage 8119 1726773030.29582: checking to see if all hosts have failed and the running result is not ok 8119 1726773030.29586: done checking to see if all hosts have failed 8119 1726773030.29588: getting the remaining hosts for this loop 8119 1726773030.29590: done getting the remaining hosts for this loop 8119 1726773030.29598: building list of next tasks for hosts 8119 1726773030.29601: getting the next task for host managed_node2 8119 1726773030.29612: done getting next task for host managed_node2 8119 1726773030.29619: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773030.29623: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773030.29625: done building task lists 8119 1726773030.29626: counting tasks in each state of execution 8119 1726773030.29629: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773030.29631: advancing hosts in ITERATING_TASKS 8119 1726773030.29632: starting to advance hosts 8119 1726773030.29634: getting the next task for host managed_node2 8119 1726773030.29637: done getting next task for host managed_node2 8119 1726773030.29638: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773030.29640: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773030.29642: done advancing hosts to next task 8119 1726773030.29653: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773030.29656: getting variables 8119 1726773030.29658: in VariableManager get_vars() 8119 1726773030.29693: Calling all_inventory to load vars for managed_node2 8119 1726773030.29698: Calling groups_inventory to load vars for managed_node2 8119 1726773030.29701: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773030.29723: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.29734: Calling all_plugins_play to load vars for managed_node2 8119 1726773030.29748: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.29764: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773030.29777: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.29786: Calling groups_plugins_play to load vars for managed_node2 8119 1726773030.29800: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.29832: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.29847: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.30162: done with get_vars() 8119 1726773030.30174: done getting variables 8119 1726773030.30180: sending task start callback, copying the task so we can template it temporarily 8119 1726773030.30182: done copying, going to template now 8119 1726773030.30187: done templating 8119 1726773030.30189: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:10:30 -0400 (0:00:00.052) 0:00:24.858 **** 8119 1726773030.30212: sending task start callback 8119 1726773030.30214: entering _queue_task() for managed_node2/service 8119 1726773030.30354: worker is 1 (out of 1 available) 8119 1726773030.30391: exiting _queue_task() for managed_node2/service 8119 1726773030.30463: done queuing things up, now waiting for results queue to drain 8119 1726773030.30469: waiting for pending results... 
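Editor's note: with __kernel_settings_profile_parent set to /etc/tuned, the role queues "Ensure required services are enabled and started"; the records that follow show the service action running Ansible's systemd module against the tuned unit and returning its unit properties (Type, MainPID, Result, and so on). The snippet below is only a hedged illustration of inspecting those same properties outside Ansible via "systemctl show", whose Key=Value output carries the fields visible in the log; the helper name is made up and this is not the module's implementation.

import subprocess

def systemd_unit_properties(unit="tuned"):
    """Return 'systemctl show <unit>' output as a dict of property -> value."""
    out = subprocess.run(
        ["systemctl", "show", unit, "--no-pager"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each output line is Name=Value; keep only well-formed lines.
    return dict(line.split("=", 1) for line in out.splitlines() if "=" in line)

if __name__ == "__main__":
    props = systemd_unit_properties("tuned")
    # The log shows Type=dbus, MainPID=8091 and Result=success for this unit.
    print(props.get("Type"), props.get("MainPID"), props.get("Result"))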
8849 1726773030.30791: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8849 1726773030.30864: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b1 8849 1726773030.32590: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8849 1726773030.32687: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8849 1726773030.32737: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8849 1726773030.32769: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8849 1726773030.32804: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8849 1726773030.32852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8849 1726773030.32911: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8849 1726773030.32943: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8849 1726773030.32968: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8849 1726773030.33078: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8849 1726773030.33106: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8849 1726773030.33148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8849 1726773030.33341: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8849 1726773030.33346: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8849 1726773030.33349: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8849 1726773030.33352: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8849 1726773030.33355: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8849 1726773030.33357: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8849 1726773030.33359: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8849 1726773030.33362: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8849 1726773030.33364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8849 1726773030.33386: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8849 1726773030.33391: 
Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8849 1726773030.33393: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8849 1726773030.33587: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8849 1726773030.33593: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8849 1726773030.33595: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8849 1726773030.33597: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8849 1726773030.33600: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8849 1726773030.33601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8849 1726773030.33603: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8849 1726773030.33605: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8849 1726773030.33607: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8849 1726773030.33626: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8849 1726773030.33629: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8849 1726773030.33631: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8849 1726773030.33730: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8849 1726773030.33762: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8849 1726773030.33772: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8849 1726773030.33782: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8849 1726773030.33791: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8849 1726773030.33898: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8849 1726773030.33912: starting attempt loop 8849 1726773030.33915: running the handler 8849 1726773030.34068: _low_level_execute_command(): starting 8849 1726773030.34076: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8849 1726773030.36675: stdout chunk (state=2): >>>/root <<< 8849 1726773030.36793: stderr chunk (state=3): >>><<< 8849 1726773030.36799: stdout chunk (state=3): >>><<< 8849 1726773030.36821: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8849 1726773030.36839: _low_level_execute_command(): starting 8849 1726773030.36846: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586 `" && echo ansible-tmp-1726773030.3683207-8849-146348559611586="` echo /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586 `" ) && sleep 0' 8849 1726773030.39516: stdout chunk (state=2): >>>ansible-tmp-1726773030.3683207-8849-146348559611586=/root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586 <<< 8849 1726773030.39638: stderr chunk (state=3): >>><<< 8849 1726773030.39643: stdout chunk (state=3): >>><<< 8849 1726773030.39662: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773030.3683207-8849-146348559611586=/root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586 , stderr= 8849 1726773030.39770: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/systemd-ZIP_DEFLATED 8849 1726773030.39863: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586/AnsiballZ_systemd.py 8849 1726773030.40213: Sending initial data 8849 1726773030.40227: Sent initial data (154 bytes) 8849 1726773030.42628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpboh8ohxu /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586/AnsiballZ_systemd.py <<< 8849 1726773030.44488: stderr chunk (state=3): >>><<< 8849 1726773030.44495: stdout chunk (state=3): >>><<< 8849 1726773030.44524: done transferring module to remote 8849 1726773030.44541: _low_level_execute_command(): starting 8849 1726773030.44545: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586/ /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586/AnsiballZ_systemd.py && sleep 0' 8849 1726773030.47199: stderr chunk (state=2): >>><<< 8849 1726773030.47215: stdout chunk (state=2): >>><<< 8849 1726773030.47237: _low_level_execute_command() done: rc=0, stdout=, stderr= 8849 1726773030.47241: _low_level_execute_command(): starting 8849 1726773030.47247: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586/AnsiballZ_systemd.py && sleep 0' 8849 1726773030.72652: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:24 EDT", "WatchdogTimestampMonotonic": "423997731", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8091", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": 
"0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ExecMainStartTimestampMonotonic": "423749407", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8091", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:24 EDT] ; stop_time=[n/a] ; pid=8091 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15015936", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryH<<< 8849 1726773030.72677: stdout chunk (state=3): >>>igh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod 
cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "Before": "shutdown.target multi-user.target", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:24 EDT", "StateChangeTimestampMonotonic": "423997735", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveExitTimestampMonotonic": "423749579", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveEnterTimestampMonotonic": "423997735", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveExitTimestampMonotonic": "423641289", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveEnterTimestampMonotonic": "423746351", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ConditionTimestampMonotonic": "423748386", "AssertTimestamp": "Thu 2024-09-19 15:10:24 EDT", "AssertTimestampMonotonic": "423748387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6ea8da0cb91e4e59bcedd6fa8fbcc8cd", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 8849 
1726773030.74106: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8849 1726773030.74156: stderr chunk (state=3): >>><<< 8849 1726773030.74162: stdout chunk (state=3): >>><<< 8849 1726773030.74181: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:24 EDT", "WatchdogTimestampMonotonic": "423997731", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8091", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ExecMainStartTimestampMonotonic": "423749407", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8091", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:24 EDT] ; stop_time=[n/a] ; pid=8091 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15015936", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": 
"no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "Before": "shutdown.target multi-user.target", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:24 EDT", "StateChangeTimestampMonotonic": "423997735", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveExitTimestampMonotonic": "423749579", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveEnterTimestampMonotonic": "423997735", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveExitTimestampMonotonic": "423641289", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveEnterTimestampMonotonic": "423746351", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": 
"replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ConditionTimestampMonotonic": "423748386", "AssertTimestamp": "Thu 2024-09-19 15:10:24 EDT", "AssertTimestampMonotonic": "423748387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6ea8da0cb91e4e59bcedd6fa8fbcc8cd", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 8849 1726773030.74337: done with _execute_module (systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8849 1726773030.74357: _low_level_execute_command(): starting 8849 1726773030.74365: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773030.3683207-8849-146348559611586/ > /dev/null 2>&1 && sleep 0' 8849 1726773030.77566: stderr chunk (state=2): >>><<< 8849 1726773030.77577: stdout chunk (state=2): >>><<< 8849 1726773030.77599: _low_level_execute_command() done: rc=0, stdout=, stderr= 8849 1726773030.77607: handler run complete 8849 1726773030.77615: attempt loop complete, returning result 8849 1726773030.77688: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8849 1726773030.77695: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 8849 1726773030.77698: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 8849 1726773030.77700: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 8849 1726773030.77702: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 8849 1726773030.77708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8849 1726773030.77712: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 8849 1726773030.77714: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 8849 1726773030.77716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8849 1726773030.77748: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8849 1726773030.77753: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8849 1726773030.77757: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8849 1726773030.77940: dumping result to json 8849 1726773030.77962: done dumping result, returning 8849 1726773030.77976: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [12a3200b-1e9d-1dbd-cc52-0000000000b1] 8849 1726773030.77988: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b1 8849 1726773030.77992: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b1 8849 1726773030.77994: WORKER PROCESS EXITING ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveEnterTimestampMonotonic": "423997735", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveExitTimestampMonotonic": "423641289", "ActiveState": "active", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:24 EDT", "AssertTimestampMonotonic": "423748387", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ConditionTimestampMonotonic": "423748386", "ConfigurationDirectoryMode": "0755", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", 
"DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8091", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ExecMainStartTimestampMonotonic": "423749407", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:24 EDT] ; stop_time=[n/a] ; pid=8091 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveEnterTimestampMonotonic": "423746351", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveExitTimestampMonotonic": "423749579", "InvocationID": "6ea8da0cb91e4e59bcedd6fa8fbcc8cd", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8091", "MemoryAccounting": "yes", "MemoryCurrent": "15015936", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", 
"ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:24 EDT", "StateChangeTimestampMonotonic": "423997735", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:24 EDT", "WatchdogTimestampMonotonic": "423997731", "WatchdogUSec": "0" } } 8119 1726773030.78653: no more pending results, returning what we have 8119 1726773030.78659: results queue empty 8119 1726773030.78661: checking for any_errors_fatal 8119 1726773030.78664: done checking for any_errors_fatal 8119 1726773030.78665: checking for max_fail_percentage 8119 1726773030.78667: done checking for max_fail_percentage 8119 1726773030.78669: checking to see if all hosts have failed and the running result is not ok 8119 1726773030.78670: done checking to see if all hosts have failed 8119 1726773030.78671: getting the remaining hosts for this loop 8119 1726773030.78673: done getting the remaining hosts for this loop 8119 1726773030.78678: building list of next tasks for hosts 8119 1726773030.78680: getting the next task for host managed_node2 8119 1726773030.78687: done getting next task for host managed_node2 8119 1726773030.78690: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773030.78693: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773030.78695: done building task lists 8119 1726773030.78696: counting tasks in each state of execution 8119 1726773030.78699: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773030.78700: advancing hosts in ITERATING_TASKS 8119 1726773030.78702: starting to advance hosts 8119 1726773030.78703: getting the next task for host managed_node2 8119 1726773030.78706: done getting next task for host managed_node2 8119 1726773030.78707: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773030.78710: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773030.78711: done advancing hosts to next task 8119 1726773030.78723: getting variables 8119 1726773030.78725: in VariableManager get_vars() 8119 1726773030.78753: Calling all_inventory to load vars for managed_node2 8119 1726773030.78759: Calling groups_inventory to load vars for managed_node2 8119 1726773030.78762: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773030.78787: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.78804: Calling all_plugins_play to load vars for managed_node2 8119 1726773030.78819: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.78829: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773030.78839: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.78846: Calling groups_plugins_play to load vars for managed_node2 8119 1726773030.78855: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.78893: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.78925: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773030.79207: done with get_vars() 8119 1726773030.79221: done getting variables 8119 1726773030.79227: sending task start callback, copying the task so we can template it temporarily 8119 1726773030.79230: done copying, going to template now 8119 1726773030.79233: done templating 8119 1726773030.79235: here goes the callback... 
TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:10:30 -0400 (0:00:00.490) 0:00:25.348 **** 8119 1726773030.79258: sending task start callback 8119 1726773030.79261: entering _queue_task() for managed_node2/file 8119 1726773030.79513: worker is 1 (out of 1 available) 8119 1726773030.79560: exiting _queue_task() for managed_node2/file 8119 1726773030.79631: done queuing things up, now waiting for results queue to drain 8119 1726773030.79636: waiting for pending results... 8886 1726773030.79849: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8886 1726773030.79921: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b2 8886 1726773030.79989: calling self._execute() 8886 1726773030.82540: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8886 1726773030.82663: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8886 1726773030.82737: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8886 1726773030.82779: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8886 1726773030.82821: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8886 1726773030.82863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8886 1726773030.82972: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8886 1726773030.83010: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8886 1726773030.83036: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8886 1726773030.83162: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8886 1726773030.83187: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8886 1726773030.83207: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8886 1726773030.83511: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8886 1726773030.83570: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8886 1726773030.83585: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8886 1726773030.83601: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8886 1726773030.83607: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8886 1726773030.83709: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8886 1726773030.83727: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8886 1726773030.83755: Loading ActionModule 'normal' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8886 1726773030.83770: starting attempt loop 8886 1726773030.83772: running the handler 8886 1726773030.83781: _low_level_execute_command(): starting 8886 1726773030.83789: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8886 1726773030.86618: stdout chunk (state=2): >>>/root <<< 8886 1726773030.86794: stderr chunk (state=3): >>><<< 8886 1726773030.86802: stdout chunk (state=3): >>><<< 8886 1726773030.86844: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8886 1726773030.86870: _low_level_execute_command(): starting 8886 1726773030.86878: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427 `" && echo ansible-tmp-1726773030.8686233-8886-108691818113427="` echo /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427 `" ) && sleep 0' 8886 1726773030.89913: stdout chunk (state=2): >>>ansible-tmp-1726773030.8686233-8886-108691818113427=/root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427 <<< 8886 1726773030.90113: stderr chunk (state=3): >>><<< 8886 1726773030.90121: stdout chunk (state=3): >>><<< 8886 1726773030.90155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773030.8686233-8886-108691818113427=/root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427 , stderr= 8886 1726773030.90252: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 8886 1726773030.90317: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427/AnsiballZ_file.py 8886 1726773030.90673: Sending initial data 8886 1726773030.90697: Sent initial data (151 bytes) 8886 1726773030.93520: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpqgfnxeul /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427/AnsiballZ_file.py <<< 8886 1726773030.94763: stderr chunk (state=3): >>><<< 8886 1726773030.94770: stdout chunk (state=3): >>><<< 8886 1726773030.94794: done transferring module to remote 8886 1726773030.94808: _low_level_execute_command(): starting 8886 1726773030.94815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427/ /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427/AnsiballZ_file.py && sleep 0' 8886 1726773030.97411: stderr chunk (state=2): >>><<< 8886 1726773030.97422: stdout chunk (state=2): >>><<< 8886 1726773030.97440: _low_level_execute_command() done: rc=0, stdout=, stderr= 8886 1726773030.97443: _low_level_execute_command(): starting 8886 1726773030.97449: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427/AnsiballZ_file.py && sleep 0' 8886 1726773031.12903: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, 
"invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 8886 1726773031.13851: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8886 1726773031.13860: stdout chunk (state=3): >>><<< 8886 1726773031.13871: stderr chunk (state=3): >>><<< 8886 1726773031.13893: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
8886 1726773031.13939: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8886 1726773031.13957: _low_level_execute_command(): starting 8886 1726773031.13964: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773030.8686233-8886-108691818113427/ > /dev/null 2>&1 && sleep 0' 8886 1726773031.17096: stderr chunk (state=2): >>><<< 8886 1726773031.17111: stdout chunk (state=2): >>><<< 8886 1726773031.17138: _low_level_execute_command() done: rc=0, stdout=, stderr= 8886 1726773031.17147: handler run complete 8886 1726773031.17155: attempt loop complete, returning result 8886 1726773031.17172: _execute() done 8886 1726773031.17175: dumping result to json 8886 1726773031.17184: done dumping result, returning 8886 1726773031.17204: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [12a3200b-1e9d-1dbd-cc52-0000000000b2] 8886 1726773031.17223: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b2 8886 1726773031.17301: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b2 8886 1726773031.17306: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8119 1726773031.17828: no more pending results, returning what we have 8119 1726773031.17833: results queue empty 8119 1726773031.17835: checking for any_errors_fatal 8119 1726773031.17844: done checking for any_errors_fatal 8119 1726773031.17847: checking for max_fail_percentage 8119 1726773031.17850: done checking for max_fail_percentage 8119 1726773031.17852: checking to see if all hosts have failed and the running result is not ok 8119 1726773031.17854: done checking to see if all hosts have failed 8119 1726773031.17856: getting the remaining hosts for this loop 8119 1726773031.17859: done getting the remaining hosts for this loop 8119 1726773031.17864: building list of next tasks for hosts 8119 1726773031.17866: getting the next task for host managed_node2 8119 1726773031.17871: done getting next task for host managed_node2 8119 1726773031.17874: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773031.17877: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773031.17878: done building task lists 8119 1726773031.17880: counting tasks in each state of execution 8119 1726773031.17884: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773031.17886: advancing hosts in ITERATING_TASKS 8119 1726773031.17888: starting to advance hosts 8119 1726773031.17889: getting the next task for host managed_node2 8119 1726773031.17892: done getting next task for host managed_node2 8119 1726773031.17894: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773031.17896: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773031.17897: done advancing hosts to next task 8119 1726773031.17911: getting variables 8119 1726773031.17913: in VariableManager get_vars() 8119 1726773031.17942: Calling all_inventory to load vars for managed_node2 8119 1726773031.17945: Calling groups_inventory to load vars for managed_node2 8119 1726773031.17947: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773031.17975: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.17991: Calling all_plugins_play to load vars for managed_node2 8119 1726773031.18006: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.18018: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773031.18029: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.18036: Calling groups_plugins_play to load vars for managed_node2 8119 1726773031.18045: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.18063: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.18084: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.18300: done with get_vars() 8119 1726773031.18317: done getting variables 8119 1726773031.18323: sending task start callback, copying the task so we can template it temporarily 8119 1726773031.18325: done copying, going to template now 8119 1726773031.18327: done templating 8119 1726773031.18328: here goes the callback... 
TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:10:31 -0400 (0:00:00.390) 0:00:25.739 **** 8119 1726773031.18344: sending task start callback 8119 1726773031.18346: entering _queue_task() for managed_node2/slurp 8119 1726773031.18485: worker is 1 (out of 1 available) 8119 1726773031.18528: exiting _queue_task() for managed_node2/slurp 8119 1726773031.18603: done queuing things up, now waiting for results queue to drain 8119 1726773031.18608: waiting for pending results... 8909 1726773031.18662: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8909 1726773031.18720: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b3 8909 1726773031.18767: calling self._execute() 8909 1726773031.20856: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8909 1726773031.20969: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8909 1726773031.21044: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8909 1726773031.21080: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8909 1726773031.21125: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8909 1726773031.21165: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8909 1726773031.21223: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8909 1726773031.21247: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8909 1726773031.21263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8909 1726773031.21363: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8909 1726773031.21381: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8909 1726773031.21399: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8909 1726773031.21648: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8909 1726773031.21689: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8909 1726773031.21701: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8909 1726773031.21717: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8909 1726773031.21724: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8909 1726773031.21827: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8909 1726773031.21841: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8909 1726773031.21867: Loading ActionModule 'normal' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 8909 1726773031.21882: starting attempt loop 8909 1726773031.21887: running the handler 8909 1726773031.21898: _low_level_execute_command(): starting 8909 1726773031.21903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8909 1726773031.24601: stdout chunk (state=2): >>>/root <<< 8909 1726773031.24656: stderr chunk (state=3): >>><<< 8909 1726773031.24661: stdout chunk (state=3): >>><<< 8909 1726773031.24680: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8909 1726773031.24704: _low_level_execute_command(): starting 8909 1726773031.24716: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075 `" && echo ansible-tmp-1726773031.246925-8909-92866202363075="` echo /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075 `" ) && sleep 0' 8909 1726773031.27458: stdout chunk (state=2): >>>ansible-tmp-1726773031.246925-8909-92866202363075=/root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075 <<< 8909 1726773031.27503: stderr chunk (state=3): >>><<< 8909 1726773031.27514: stdout chunk (state=3): >>><<< 8909 1726773031.27538: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773031.246925-8909-92866202363075=/root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075 , stderr= 8909 1726773031.27633: ANSIBALLZ: Using lock for slurp 8909 1726773031.27640: ANSIBALLZ: Acquiring lock 8909 1726773031.27645: ANSIBALLZ: Lock acquired: 140408695168928 8909 1726773031.27652: ANSIBALLZ: Creating module 8909 1726773031.37596: ANSIBALLZ: Writing module into payload 8909 1726773031.37654: ANSIBALLZ: Writing module 8909 1726773031.37674: ANSIBALLZ: Renaming module 8909 1726773031.37679: ANSIBALLZ: Done creating module 8909 1726773031.37706: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075/AnsiballZ_slurp.py 8909 1726773031.38076: Sending initial data 8909 1726773031.38095: Sent initial data (150 bytes) 8909 1726773031.40697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpic_i8m0i /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075/AnsiballZ_slurp.py <<< 8909 1726773031.41685: stderr chunk (state=3): >>><<< 8909 1726773031.41693: stdout chunk (state=3): >>><<< 8909 1726773031.41719: done transferring module to remote 8909 1726773031.41733: _low_level_execute_command(): starting 8909 1726773031.41738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075/ /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075/AnsiballZ_slurp.py && sleep 0' 8909 1726773031.44386: stderr chunk (state=2): >>><<< 8909 1726773031.44398: stdout chunk (state=2): >>><<< 8909 1726773031.44418: _low_level_execute_command() done: rc=0, stdout=, stderr= 8909 1726773031.44422: _low_level_execute_command(): starting 8909 1726773031.44429: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075/AnsiballZ_slurp.py && sleep 0' 8909 1726773031.58739: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdAo=", 
"source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8909 1726773031.59681: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8909 1726773031.59779: stderr chunk (state=3): >>><<< 8909 1726773031.59789: stdout chunk (state=3): >>><<< 8909 1726773031.59820: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.150 closed. 8909 1726773031.59855: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8909 1726773031.59873: _low_level_execute_command(): starting 8909 1726773031.59881: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773031.246925-8909-92866202363075/ > /dev/null 2>&1 && sleep 0' 8909 1726773031.63078: stderr chunk (state=2): >>><<< 8909 1726773031.63091: stdout chunk (state=2): >>><<< 8909 1726773031.63116: _low_level_execute_command() done: rc=0, stdout=, stderr= 8909 1726773031.63125: handler run complete 8909 1726773031.63150: attempt loop complete, returning result 8909 1726773031.63162: _execute() done 8909 1726773031.63164: dumping result to json 8909 1726773031.63167: done dumping result, returning 8909 1726773031.63179: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [12a3200b-1e9d-1dbd-cc52-0000000000b3] 8909 1726773031.63194: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b3 8909 1726773031.63231: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b3 8909 1726773031.63236: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdAo=", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8119 1726773031.63460: no more pending results, returning what we have 8119 1726773031.63466: results queue empty 8119 1726773031.63468: checking for any_errors_fatal 8119 1726773031.63474: done checking for any_errors_fatal 8119 1726773031.63476: checking for max_fail_percentage 8119 1726773031.63479: done checking for max_fail_percentage 8119 1726773031.63481: checking to see if all hosts have failed and the running result is not ok 8119 1726773031.63485: done checking to see if all hosts have failed 8119 1726773031.63487: getting the remaining hosts for this loop 8119 1726773031.63490: done getting the remaining hosts for this loop 8119 1726773031.63499: building list of next tasks for hosts 8119 1726773031.63501: getting the next task for host managed_node2 8119 1726773031.63508: done getting next task for host managed_node2 8119 1726773031.63512: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set 
active_profile 8119 1726773031.63517: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773031.63520: done building task lists 8119 1726773031.63522: counting tasks in each state of execution 8119 1726773031.63526: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773031.63528: advancing hosts in ITERATING_TASKS 8119 1726773031.63530: starting to advance hosts 8119 1726773031.63532: getting the next task for host managed_node2 8119 1726773031.63536: done getting next task for host managed_node2 8119 1726773031.63539: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773031.63542: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773031.63543: done advancing hosts to next task 8119 1726773031.63558: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773031.63562: getting variables 8119 1726773031.63565: in VariableManager get_vars() 8119 1726773031.63602: Calling all_inventory to load vars for managed_node2 8119 1726773031.63610: Calling groups_inventory to load vars for managed_node2 8119 1726773031.63614: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773031.63637: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.63648: Calling all_plugins_play to load vars for managed_node2 8119 1726773031.63658: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.63666: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773031.63677: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.63685: Calling groups_plugins_play to load vars for managed_node2 8119 1726773031.63700: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.63722: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 
1726773031.63739: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.63955: done with get_vars() 8119 1726773031.63965: done getting variables 8119 1726773031.63970: sending task start callback, copying the task so we can template it temporarily 8119 1726773031.63971: done copying, going to template now 8119 1726773031.63973: done templating 8119 1726773031.63974: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:10:31 -0400 (0:00:00.456) 0:00:26.196 **** 8119 1726773031.63993: sending task start callback 8119 1726773031.63995: entering _queue_task() for managed_node2/set_fact 8119 1726773031.64115: worker is 1 (out of 1 available) 8119 1726773031.64154: exiting _queue_task() for managed_node2/set_fact 8119 1726773031.64228: done queuing things up, now waiting for results queue to drain 8119 1726773031.64233: waiting for pending results... 8937 1726773031.64300: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8937 1726773031.64370: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b4 8937 1726773031.64427: calling self._execute() 8937 1726773031.66960: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8937 1726773031.67068: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8937 1726773031.67132: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8937 1726773031.67162: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8937 1726773031.67192: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8937 1726773031.67247: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8937 1726773031.67299: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8937 1726773031.67330: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8937 1726773031.67348: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8937 1726773031.67446: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8937 1726773031.67468: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8937 1726773031.67488: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8937 1726773031.67943: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8937 1726773031.67996: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8937 1726773031.68014: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8937 1726773031.68031: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8937 1726773031.68040: Loading 
ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8937 1726773031.68167: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8937 1726773031.68188: starting attempt loop 8937 1726773031.68191: running the handler 8937 1726773031.68204: handler run complete 8937 1726773031.68208: attempt loop complete, returning result 8937 1726773031.68211: _execute() done 8937 1726773031.68213: dumping result to json 8937 1726773031.68215: done dumping result, returning 8937 1726773031.68220: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [12a3200b-1e9d-1dbd-cc52-0000000000b4] 8937 1726773031.68226: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b4 8937 1726773031.68251: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b4 8937 1726773031.68255: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8119 1726773031.68480: no more pending results, returning what we have 8119 1726773031.68485: results queue empty 8119 1726773031.68487: checking for any_errors_fatal 8119 1726773031.68493: done checking for any_errors_fatal 8119 1726773031.68495: checking for max_fail_percentage 8119 1726773031.68498: done checking for max_fail_percentage 8119 1726773031.68499: checking to see if all hosts have failed and the running result is not ok 8119 1726773031.68500: done checking to see if all hosts have failed 8119 1726773031.68502: getting the remaining hosts for this loop 8119 1726773031.68504: done getting the remaining hosts for this loop 8119 1726773031.68509: building list of next tasks for hosts 8119 1726773031.68512: getting the next task for host managed_node2 8119 1726773031.68517: done getting next task for host managed_node2 8119 1726773031.68520: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773031.68523: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773031.68525: done building task lists 8119 1726773031.68526: counting tasks in each state of execution 8119 1726773031.68529: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773031.68530: advancing hosts in ITERATING_TASKS 8119 1726773031.68532: starting to advance hosts 8119 1726773031.68533: getting the next task for host managed_node2 8119 1726773031.68535: done getting next task for host managed_node2 8119 1726773031.68537: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773031.68539: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773031.68541: done advancing hosts to next task 8119 1726773031.68552: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773031.68555: getting variables 8119 1726773031.68557: in VariableManager get_vars() 8119 1726773031.68585: Calling all_inventory to load vars for managed_node2 8119 1726773031.68591: Calling groups_inventory to load vars for managed_node2 8119 1726773031.68594: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773031.68617: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.68630: Calling all_plugins_play to load vars for managed_node2 8119 1726773031.68642: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.68651: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773031.68662: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.68668: Calling groups_plugins_play to load vars for managed_node2 8119 1726773031.68678: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.68700: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.68721: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773031.68941: done with get_vars() 8119 1726773031.68952: done getting variables 8119 1726773031.68957: sending task start callback, copying the task so we can template it temporarily 8119 1726773031.68959: done copying, going to template now 8119 1726773031.68961: done templating 8119 1726773031.68962: here goes the callback... 
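
[Editor's note] The set_fact result above ("virtual-guest kernel_settings") is the decoded slurp content with the role's own tuned profile name appended; the entries that follow queue the copy task that writes this string back to /etc/tuned/active_profile. The exact Jinja2 expression lives in the role's tasks, but the effective logic is roughly this sketch (the helper name is mine, not the role's):

    import base64

    def build_active_profile(slurp_content_b64, profile="kernel_settings"):
        # Decode the base64 content returned by slurp and append the
        # kernel_settings profile if it is not already listed.
        current = base64.b64decode(slurp_content_b64).decode("utf-8").strip()
        profiles = current.split()
        if profile not in profiles:
            profiles.append(profile)
        return " ".join(profiles)

    # "dmlydHVhbC1ndWVzdAo=" decodes to "virtual-guest\n", so this prints
    # the same value the set_fact task recorded above.
    print(build_active_profile("dmlydHVhbC1ndWVzdAo="))  # virtual-guest kernel_settings
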
TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:10:31 -0400 (0:00:00.049) 0:00:26.246 **** 8119 1726773031.68977: sending task start callback 8119 1726773031.68979: entering _queue_task() for managed_node2/copy 8119 1726773031.69102: worker is 1 (out of 1 available) 8119 1726773031.69140: exiting _queue_task() for managed_node2/copy 8119 1726773031.69211: done queuing things up, now waiting for results queue to drain 8119 1726773031.69216: waiting for pending results... 8940 1726773031.69276: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8940 1726773031.69332: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b5 8940 1726773031.69378: calling self._execute() 8940 1726773031.71518: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8940 1726773031.71636: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8940 1726773031.71711: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8940 1726773031.71750: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8940 1726773031.71793: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8940 1726773031.71835: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8940 1726773031.71890: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8940 1726773031.71923: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8940 1726773031.71965: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8940 1726773031.72074: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8940 1726773031.72105: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8940 1726773031.72131: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8940 1726773031.72503: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8940 1726773031.72557: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8940 1726773031.72572: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8940 1726773031.72590: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8940 1726773031.72598: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8940 1726773031.72731: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8940 1726773031.72749: starting 
attempt loop 8940 1726773031.72753: running the handler 8940 1726773031.72764: _low_level_execute_command(): starting 8940 1726773031.72771: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8940 1726773031.75334: stdout chunk (state=2): >>>/root <<< 8940 1726773031.75847: stderr chunk (state=3): >>><<< 8940 1726773031.75855: stdout chunk (state=3): >>><<< 8940 1726773031.75885: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8940 1726773031.75906: _low_level_execute_command(): starting 8940 1726773031.75918: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486 `" && echo ansible-tmp-1726773031.7589834-8940-239113968031486="` echo /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486 `" ) && sleep 0' 8940 1726773031.79613: stdout chunk (state=2): >>>ansible-tmp-1726773031.7589834-8940-239113968031486=/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486 <<< 8940 1726773031.79757: stderr chunk (state=3): >>><<< 8940 1726773031.79763: stdout chunk (state=3): >>><<< 8940 1726773031.79784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773031.7589834-8940-239113968031486=/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486 , stderr= 8940 1726773031.79928: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 8940 1726773031.79988: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_stat.py 8940 1726773031.80732: Sending initial data 8940 1726773031.80745: Sent initial data (151 bytes) 8940 1726773031.83178: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmptebyvy3z /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_stat.py <<< 8940 1726773031.84537: stderr chunk (state=3): >>><<< 8940 1726773031.84545: stdout chunk (state=3): >>><<< 8940 1726773031.84574: done transferring module to remote 8940 1726773031.84594: _low_level_execute_command(): starting 8940 1726773031.84601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/ /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_stat.py && sleep 0' 8940 1726773031.87484: stderr chunk (state=2): >>><<< 8940 1726773031.87502: stdout chunk (state=2): >>><<< 8940 1726773031.87531: _low_level_execute_command() done: rc=0, stdout=, stderr= 8940 1726773031.87537: _low_level_execute_command(): starting 8940 1726773031.87546: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_stat.py && sleep 0' 8940 1726773032.03216: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773031.5853307, "mtime": 1726773024.963343, "ctime": 1726773024.963343, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 
0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 8940 1726773032.04849: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8940 1726773032.04859: stdout chunk (state=3): >>><<< 8940 1726773032.04870: stderr chunk (state=3): >>><<< 8940 1726773032.04891: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773031.5853307, "mtime": 1726773024.963343, "ctime": 1726773024.963343, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 
8940 1726773032.04968: done with _execute_module (stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8940 1726773032.05659: Sending initial data 8940 1726773032.05673: Sent initial data (140 bytes) 8940 1726773032.08293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmps5m68vai /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source <<< 8940 1726773032.09391: stderr chunk (state=3): >>><<< 8940 1726773032.09399: stdout chunk (state=3): >>><<< 8940 1726773032.09434: _low_level_execute_command(): starting 8940 1726773032.09442: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/ /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source && sleep 0' 8940 1726773032.12917: stderr chunk (state=2): >>><<< 8940 1726773032.12936: stdout chunk (state=2): >>><<< 8940 1726773032.12961: _low_level_execute_command() done: rc=0, stdout=, stderr= 8940 1726773032.13100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 8940 1726773032.13152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_copy.py 8940 1726773032.13710: Sending initial data 8940 1726773032.13724: Sent initial data (151 bytes) 8940 1726773032.16194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp8hlwfmxr /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_copy.py <<< 8940 1726773032.17284: stderr chunk (state=3): >>><<< 8940 1726773032.17290: stdout chunk (state=3): >>><<< 8940 1726773032.17310: done transferring module to remote 8940 1726773032.17322: _low_level_execute_command(): starting 8940 1726773032.17327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/ /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_copy.py && sleep 0' 8940 1726773032.19881: stderr chunk (state=2): >>><<< 8940 1726773032.19894: stdout chunk (state=2): >>><<< 8940 1726773032.19913: _low_level_execute_command() done: rc=0, stdout=, stderr= 8940 1726773032.19918: _low_level_execute_command(): starting 8940 1726773032.19926: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/AnsiballZ_copy.py && sleep 0' 8940 1726773032.35384: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": 
"0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source", "_original_basename": "tmps5m68vai", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 8940 1726773032.36437: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8940 1726773032.36489: stderr chunk (state=3): >>><<< 8940 1726773032.36497: stdout chunk (state=3): >>><<< 8940 1726773032.36519: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source", "_original_basename": "tmps5m68vai", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
8940 1726773032.36549: done with _execute_module (copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source', '_original_basename': 'tmps5m68vai', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8940 1726773032.36561: _low_level_execute_command(): starting 8940 1726773032.36566: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/ > /dev/null 2>&1 && sleep 0' 8940 1726773032.39232: stderr chunk (state=2): >>><<< 8940 1726773032.39243: stdout chunk (state=2): >>><<< 8940 1726773032.39264: _low_level_execute_command() done: rc=0, stdout=, stderr= 8940 1726773032.39275: handler run complete 8940 1726773032.39313: attempt loop complete, returning result 8940 1726773032.39327: _execute() done 8940 1726773032.39331: dumping result to json 8940 1726773032.39336: done dumping result, returning 8940 1726773032.39348: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [12a3200b-1e9d-1dbd-cc52-0000000000b5] 8940 1726773032.39361: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b5 8940 1726773032.39402: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b5 8940 1726773032.39445: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7589834-8940-239113968031486/source", "state": "file", "uid": 0 } 8119 1726773032.39608: no more pending results, returning what we have 8119 1726773032.39616: results queue empty 8119 1726773032.39619: checking for any_errors_fatal 8119 1726773032.39623: done checking for any_errors_fatal 8119 1726773032.39625: checking for max_fail_percentage 8119 1726773032.39628: done checking for max_fail_percentage 8119 1726773032.39630: checking to see if all hosts have failed and the running result is not ok 8119 1726773032.39631: done checking to see if all hosts have failed 8119 1726773032.39633: getting the remaining hosts for this loop 8119 1726773032.39636: done getting the remaining hosts for this loop 8119 1726773032.39644: building list of next tasks for hosts 8119 1726773032.39646: getting the next task for host managed_node2 8119 1726773032.39653: done getting next task for host managed_node2 8119 1726773032.39657: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773032.39661: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773032.39663: done building task lists 8119 1726773032.39665: counting tasks in each state of execution 8119 1726773032.39669: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773032.39671: advancing hosts in ITERATING_TASKS 8119 1726773032.39673: starting to advance hosts 8119 1726773032.39675: getting the next task for host managed_node2 8119 1726773032.39678: done getting next task for host managed_node2 8119 1726773032.39681: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773032.39694: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773032.39697: done advancing hosts to next task 8119 1726773032.39715: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773032.39720: getting variables 8119 1726773032.39723: in VariableManager get_vars() 8119 1726773032.39754: Calling all_inventory to load vars for managed_node2 8119 1726773032.39758: Calling groups_inventory to load vars for managed_node2 8119 1726773032.39761: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773032.39785: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773032.39797: Calling all_plugins_play to load vars for managed_node2 8119 1726773032.39808: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773032.39823: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773032.39836: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773032.39842: Calling groups_plugins_play to load vars for managed_node2 8119 1726773032.39852: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773032.39870: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773032.39887: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 
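
[Editor's note] Each task in this log follows the same remote choreography, visible again below for "Set profile_mode to manual": probe the home directory, create a per-task temp directory under ~/.ansible/tmp, sftp the AnsiballZ payload across, chmod it, execute it with /usr/libexec/platform-python, then remove the temp directory. A condensed sketch of that command sequence, paraphrased from the _low_level_execute_command() entries (the temp-directory name in the example is made up, and the sftp transfer happens between steps and is not a shell command):

    def remote_command_plan(tmp_dir, module_payload="AnsiballZ_copy.py"):
        # The shell steps the ssh connection plugin issues for one module run,
        # condensed from the executing: lines in this log.
        return [
            "echo ~ && sleep 0",
            f'( umask 77 && mkdir -p "{tmp_dir}" ) && sleep 0',
            f"chmod u+x {tmp_dir}/ {tmp_dir}/{module_payload} && sleep 0",
            f"/usr/libexec/platform-python {tmp_dir}/{module_payload} && sleep 0",
            f"rm -f -r {tmp_dir}/ > /dev/null 2>&1 && sleep 0",
        ]

    for cmd in remote_command_plan("/root/.ansible/tmp/ansible-tmp-EXAMPLE"):
        print(cmd)
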
8119 1726773032.40093: done with get_vars() 8119 1726773032.40104: done getting variables 8119 1726773032.40111: sending task start callback, copying the task so we can template it temporarily 8119 1726773032.40113: done copying, going to template now 8119 1726773032.40114: done templating 8119 1726773032.40116: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.711) 0:00:26.957 **** 8119 1726773032.40131: sending task start callback 8119 1726773032.40133: entering _queue_task() for managed_node2/copy 8119 1726773032.40250: worker is 1 (out of 1 available) 8119 1726773032.40289: exiting _queue_task() for managed_node2/copy 8119 1726773032.40365: done queuing things up, now waiting for results queue to drain 8119 1726773032.40370: waiting for pending results... 8995 1726773032.40422: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8995 1726773032.40471: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b6 8995 1726773032.40520: calling self._execute() 8995 1726773032.42606: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 8995 1726773032.42691: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 8995 1726773032.42740: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 8995 1726773032.42778: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 8995 1726773032.42811: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 8995 1726773032.42840: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 8995 1726773032.42881: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 8995 1726773032.42910: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 8995 1726773032.42929: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 8995 1726773032.43019: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 8995 1726773032.43038: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 8995 1726773032.43052: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 8995 1726773032.43274: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 8995 1726773032.43309: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8995 1726773032.43321: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 8995 1726773032.43331: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8995 1726773032.43336: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8995 1726773032.43432: Loading ActionModule 
'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8995 1726773032.43445: starting attempt loop 8995 1726773032.43447: running the handler 8995 1726773032.43455: _low_level_execute_command(): starting 8995 1726773032.43461: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8995 1726773032.46086: stdout chunk (state=2): >>>/root <<< 8995 1726773032.46214: stderr chunk (state=3): >>><<< 8995 1726773032.46221: stdout chunk (state=3): >>><<< 8995 1726773032.46245: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8995 1726773032.46262: _low_level_execute_command(): starting 8995 1726773032.46269: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103 `" && echo ansible-tmp-1726773032.462555-8995-263095909667103="` echo /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103 `" ) && sleep 0' 8995 1726773032.49507: stdout chunk (state=2): >>>ansible-tmp-1726773032.462555-8995-263095909667103=/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103 <<< 8995 1726773032.49644: stderr chunk (state=3): >>><<< 8995 1726773032.49651: stdout chunk (state=3): >>><<< 8995 1726773032.49674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773032.462555-8995-263095909667103=/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103 , stderr= 8995 1726773032.49836: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 8995 1726773032.49900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_stat.py 8995 1726773032.50643: Sending initial data 8995 1726773032.50656: Sent initial data (150 bytes) 8995 1726773032.53305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp8lqejowe /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_stat.py <<< 8995 1726773032.54831: stderr chunk (state=3): >>><<< 8995 1726773032.54839: stdout chunk (state=3): >>><<< 8995 1726773032.54869: done transferring module to remote 8995 1726773032.54891: _low_level_execute_command(): starting 8995 1726773032.54898: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/ /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_stat.py && sleep 0' 8995 1726773032.58563: stderr chunk (state=2): >>><<< 8995 1726773032.58578: stdout chunk (state=2): >>><<< 8995 1726773032.58607: _low_level_execute_command() done: rc=0, stdout=, stderr= 8995 1726773032.58616: _low_level_execute_command(): starting 8995 1726773032.58624: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_stat.py && sleep 0' 8995 1726773032.74306: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, 
"atime": 1726773024.8333433, "mtime": 1726773024.963343, "ctime": 1726773024.963343, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 8995 1726773032.75259: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8995 1726773032.75269: stdout chunk (state=3): >>><<< 8995 1726773032.75281: stderr chunk (state=3): >>><<< 8995 1726773032.75303: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726773024.8333433, "mtime": 1726773024.963343, "ctime": 1726773024.963343, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 
8995 1726773032.75388: done with _execute_module (stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8995 1726773032.77203: Sending initial data 8995 1726773032.77221: Sent initial data (139 bytes) 8995 1726773032.79800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpqy_e6zsz /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source <<< 8995 1726773032.80614: stderr chunk (state=3): >>><<< 8995 1726773032.80622: stdout chunk (state=3): >>><<< 8995 1726773032.80657: _low_level_execute_command(): starting 8995 1726773032.80665: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/ /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source && sleep 0' 8995 1726773032.83565: stderr chunk (state=2): >>><<< 8995 1726773032.83580: stdout chunk (state=2): >>><<< 8995 1726773032.83609: _low_level_execute_command() done: rc=0, stdout=, stderr= 8995 1726773032.83746: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 8995 1726773032.83809: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_copy.py 8995 1726773032.85178: Sending initial data 8995 1726773032.85196: Sent initial data (150 bytes) 8995 1726773032.88354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp2fw4kh_v /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_copy.py <<< 8995 1726773032.89824: stderr chunk (state=3): >>><<< 8995 1726773032.89832: stdout chunk (state=3): >>><<< 8995 1726773032.89861: done transferring module to remote 8995 1726773032.89879: _low_level_execute_command(): starting 8995 1726773032.89888: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/ /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_copy.py && sleep 0' 8995 1726773032.93422: stderr chunk (state=2): >>><<< 8995 1726773032.93436: stdout chunk (state=2): >>><<< 8995 1726773032.93459: _low_level_execute_command() done: rc=0, stdout=, stderr= 8995 1726773032.93465: _low_level_execute_command(): starting 8995 1726773032.93473: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/AnsiballZ_copy.py && sleep 0' 8995 1726773033.09813: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": 
"file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source", "_original_basename": "tmpqy_e6zsz", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 8995 1726773033.10976: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 8995 1726773033.10989: stdout chunk (state=3): >>><<< 8995 1726773033.11000: stderr chunk (state=3): >>><<< 8995 1726773033.11020: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source", "_original_basename": "tmpqy_e6zsz", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
8995 1726773033.11063: done with _execute_module (copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source', '_original_basename': 'tmpqy_e6zsz', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8995 1726773033.11084: _low_level_execute_command(): starting 8995 1726773033.11093: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/ > /dev/null 2>&1 && sleep 0' 8995 1726773033.16394: stderr chunk (state=2): >>><<< 8995 1726773033.16408: stdout chunk (state=2): >>><<< 8995 1726773033.16433: _low_level_execute_command() done: rc=0, stdout=, stderr= 8995 1726773033.16447: handler run complete 8995 1726773033.16496: attempt loop complete, returning result 8995 1726773033.16513: _execute() done 8995 1726773033.16517: dumping result to json 8995 1726773033.16524: done dumping result, returning 8995 1726773033.16540: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [12a3200b-1e9d-1dbd-cc52-0000000000b6] 8995 1726773033.16557: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b6 8995 1726773033.17699: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b6 8995 1726773033.17705: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "src": "/root/.ansible/tmp/ansible-tmp-1726773032.462555-8995-263095909667103/source", "state": "file", "uid": 0 } 8119 1726773033.18741: no more pending results, returning what we have 8119 1726773033.18748: results queue empty 8119 1726773033.18750: checking for any_errors_fatal 8119 1726773033.18756: done checking for any_errors_fatal 8119 1726773033.18758: checking for max_fail_percentage 8119 1726773033.18762: done checking for max_fail_percentage 8119 1726773033.18764: checking to see if all hosts have failed and the running result is not ok 8119 1726773033.18766: done checking to see if all hosts have failed 8119 1726773033.18768: getting the remaining hosts for this loop 8119 1726773033.18771: done getting the remaining hosts for this loop 8119 1726773033.18779: building list of next tasks for hosts 8119 1726773033.18784: getting the next task for host managed_node2 8119 1726773033.18793: done getting next task for host managed_node2 8119 1726773033.18797: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773033.18802: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773033.18804: done building task lists 8119 1726773033.18806: counting tasks in each state of execution 8119 1726773033.18813: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773033.18816: advancing hosts in ITERATING_TASKS 8119 1726773033.18818: starting to advance hosts 8119 1726773033.18821: getting the next task for host managed_node2 8119 1726773033.18825: done getting next task for host managed_node2 8119 1726773033.18828: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773033.18832: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773033.18834: done advancing hosts to next task 8119 1726773033.18878: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773033.18888: getting variables 8119 1726773033.18892: in VariableManager get_vars() 8119 1726773033.18930: Calling all_inventory to load vars for managed_node2 8119 1726773033.18937: Calling groups_inventory to load vars for managed_node2 8119 1726773033.18941: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773033.18970: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.18988: Calling all_plugins_play to load vars for managed_node2 8119 1726773033.19011: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.19027: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773033.19046: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.19056: Calling groups_plugins_play to load vars for managed_node2 8119 1726773033.19074: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.19108: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.19138: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.19477: done with get_vars() 8119 1726773033.19494: done getting variables 8119 1726773033.19502: sending task start callback, copying the 
task so we can template it temporarily 8119 1726773033.19505: done copying, going to template now 8119 1726773033.19508: done templating 8119 1726773033.19512: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:10:33 -0400 (0:00:00.794) 0:00:27.751 **** 8119 1726773033.19537: sending task start callback 8119 1726773033.19540: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773033.19742: worker is 1 (out of 1 available) 8119 1726773033.19786: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773033.19862: done queuing things up, now waiting for results queue to drain 8119 1726773033.19867: waiting for pending results... 9054 1726773033.20102: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 9054 1726773033.20174: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b7 9054 1726773033.20238: calling self._execute() 9054 1726773033.22721: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9054 1726773033.22838: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9054 1726773033.22912: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9054 1726773033.22966: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9054 1726773033.23007: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9054 1726773033.23050: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9054 1726773033.23113: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9054 1726773033.23145: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9054 1726773033.23170: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9054 1726773033.23313: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9054 1726773033.23339: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9054 1726773033.23362: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9054 1726773033.23756: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9054 1726773033.23806: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9054 1726773033.23825: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9054 1726773033.23839: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9054 1726773033.23845: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9054 1726773033.23952: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No 
module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 9054 1726773033.23973: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 9054 1726773033.24013: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 9054 1726773033.24031: starting attempt loop 9054 1726773033.24035: running the handler 9054 1726773033.24047: _low_level_execute_command(): starting 9054 1726773033.24053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9054 1726773033.26890: stdout chunk (state=2): >>>/root <<< 9054 1726773033.27024: stderr chunk (state=3): >>><<< 9054 1726773033.27031: stdout chunk (state=3): >>><<< 9054 1726773033.27062: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9054 1726773033.27088: _low_level_execute_command(): starting 9054 1726773033.27097: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290 `" && echo ansible-tmp-1726773033.2707858-9054-166175629972290="` echo /root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290 `" ) && sleep 0' 9054 1726773033.30441: stdout chunk (state=2): >>>ansible-tmp-1726773033.2707858-9054-166175629972290=/root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290 <<< 9054 1726773033.30459: stderr chunk (state=2): >>><<< 9054 1726773033.30471: stdout chunk (state=3): >>><<< 9054 1726773033.30492: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773033.2707858-9054-166175629972290=/root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290 , stderr= 9054 1726773033.30593: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 9054 1726773033.30662: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290/AnsiballZ_kernel_settings_get_config.py 9054 1726773033.31681: Sending initial data 9054 1726773033.31696: Sent initial data (173 bytes) 9054 1726773033.35507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpb3_rdriu /root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290/AnsiballZ_kernel_settings_get_config.py <<< 9054 1726773033.42815: stderr chunk (state=3): >>><<< 9054 1726773033.42824: stdout chunk (state=3): >>><<< 9054 1726773033.42856: done transferring module to remote 9054 1726773033.42875: _low_level_execute_command(): starting 9054 1726773033.42882: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290/ /root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9054 1726773033.47176: stderr chunk (state=2): >>><<< 9054 1726773033.47195: stdout chunk (state=2): >>><<< 9054 1726773033.47224: _low_level_execute_command() done: rc=0, stdout=, stderr= 9054 1726773033.47231: _low_level_execute_command(): starting 9054 1726773033.47240: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9054 1726773033.63121: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": "65530"}, "sysfs": {"/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0", "/sys/kernel/debug/x86/ibrs_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 9054 1726773033.64226: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9054 1726773033.64237: stdout chunk (state=3): >>><<< 9054 1726773033.64249: stderr chunk (state=3): >>><<< 9054 1726773033.64271: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": "65530"}, "sysfs": {"/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0", "/sys/kernel/debug/x86/ibrs_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 9054 1726773033.64320: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9054 1726773033.64340: _low_level_execute_command(): starting 9054 1726773033.64348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773033.2707858-9054-166175629972290/ > /dev/null 2>&1 && sleep 0' 9054 1726773033.69297: stderr chunk (state=2): >>><<< 9054 1726773033.69319: stdout chunk (state=2): >>><<< 9054 1726773033.69348: _low_level_execute_command() done: rc=0, stdout=, stderr= 9054 1726773033.69358: handler run complete 9054 1726773033.69439: attempt loop complete, returning result 9054 1726773033.69456: _execute() done 9054 1726773033.69459: dumping result to json 9054 1726773033.69464: done dumping result, returning 9054 1726773033.69480: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [12a3200b-1e9d-1dbd-cc52-0000000000b7] 9054 1726773033.69500: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b7 9054 1726773033.70202: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b7 9054 1726773033.70212: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": 
"65530" }, "sysfs": { "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8119 1726773033.70726: no more pending results, returning what we have 8119 1726773033.70734: results queue empty 8119 1726773033.70736: checking for any_errors_fatal 8119 1726773033.70742: done checking for any_errors_fatal 8119 1726773033.70744: checking for max_fail_percentage 8119 1726773033.70747: done checking for max_fail_percentage 8119 1726773033.70750: checking to see if all hosts have failed and the running result is not ok 8119 1726773033.70752: done checking to see if all hosts have failed 8119 1726773033.70754: getting the remaining hosts for this loop 8119 1726773033.70757: done getting the remaining hosts for this loop 8119 1726773033.70765: building list of next tasks for hosts 8119 1726773033.70768: getting the next task for host managed_node2 8119 1726773033.70777: done getting next task for host managed_node2 8119 1726773033.70781: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773033.70788: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773033.70791: done building task lists 8119 1726773033.70794: counting tasks in each state of execution 8119 1726773033.70798: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773033.70801: advancing hosts in ITERATING_TASKS 8119 1726773033.70803: starting to advance hosts 8119 1726773033.70805: getting the next task for host managed_node2 8119 1726773033.70812: done getting next task for host managed_node2 8119 1726773033.70816: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773033.70820: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773033.70822: done advancing hosts to next task 8119 1726773033.70897: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773033.70904: getting variables 8119 1726773033.70907: in VariableManager get_vars() 8119 1726773033.70945: Calling all_inventory to load vars for managed_node2 8119 1726773033.70951: Calling groups_inventory to load vars for managed_node2 8119 1726773033.70955: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773033.70987: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.71005: Calling all_plugins_play to load vars for managed_node2 8119 1726773033.71027: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.71043: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773033.71062: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.71074: Calling groups_plugins_play to load vars for managed_node2 8119 1726773033.71093: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.71130: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.71155: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773033.71505: done with get_vars() 8119 1726773033.71522: done getting variables 8119 1726773033.71530: sending task start callback, copying the task so we can template it temporarily 8119 1726773033.71534: done copying, going to template now 8119 1726773033.71537: done templating 8119 1726773033.71539: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:10:33 -0400 (0:00:00.520) 0:00:28.272 **** 8119 1726773033.71570: sending task start callback 8119 1726773033.71573: entering _queue_task() for managed_node2/template 8119 1726773033.71577: Creating lock for template 8119 1726773033.71868: worker is 1 (out of 1 available) 8119 1726773033.71907: exiting _queue_task() for managed_node2/template 8119 1726773033.71980: done queuing things up, now waiting for results queue to drain 8119 1726773033.71987: waiting for pending results... 
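Before the worker output for the next task continues, here is a rough reconstruction of the two role tasks exercised in this stretch of the log: "Get current config", whose result appears above, and "Apply kernel settings", which runs next. The module path, template name, destination and mode are taken from the logged arguments and template search paths; the register variable name and the template's contents are assumptions and only illustrative:

    - name: Get current config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/kernel_settings/tuned.conf
      register: __kernel_settings_profile_contents   # hypothetical variable name

    - name: Apply kernel settings
      template:
        src: kernel_settings.j2
        dest: /etc/tuned/kernel_settings/tuned.conf
        mode: "0644"

The src/dest/mode values for the template task are consistent with the kernel_settings.j2 search paths and the copy arguments logged further below.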
9105 1726773033.72230: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 9105 1726773033.72298: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b8 9105 1726773033.72357: calling self._execute() 9105 1726773033.74958: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9105 1726773033.75081: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9105 1726773033.75170: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9105 1726773033.75213: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9105 1726773033.75254: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9105 1726773033.75354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9105 1726773033.75426: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9105 1726773033.75459: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9105 1726773033.75486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9105 1726773033.75604: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9105 1726773033.75632: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9105 1726773033.75656: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9105 1726773033.76202: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9105 1726773033.76257: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9105 1726773033.76271: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9105 1726773033.76289: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9105 1726773033.76298: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9105 1726773033.76588: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9105 1726773033.76611: starting attempt loop 9105 1726773033.76615: running the handler 9105 1726773033.76627: _low_level_execute_command(): starting 9105 1726773033.76634: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9105 1726773033.82238: stdout chunk (state=2): >>>/root <<< 9105 1726773033.82261: stderr chunk (state=2): >>><<< 9105 1726773033.82276: stdout chunk (state=3): >>><<< 9105 1726773033.82299: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9105 1726773033.82320: _low_level_execute_command(): starting 9105 1726773033.82329: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138 `" && echo ansible-tmp-1726773033.8231146-9105-270043369574138="` echo /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138 `" ) && sleep 0' 9105 1726773033.87066: stdout chunk (state=2): >>>ansible-tmp-1726773033.8231146-9105-270043369574138=/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138 <<< 9105 1726773033.87082: stderr chunk (state=2): >>><<< 9105 1726773033.87096: stdout chunk (state=3): >>><<< 9105 1726773033.87114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773033.8231146-9105-270043369574138=/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138 , stderr= 9105 1726773033.87145: evaluation_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 9105 1726773033.87172: search_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 9105 1726773033.90164: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9105 1726773033.90174: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9105 1726773033.90179: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9105 1726773033.90182: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9105 1726773033.90189: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9105 1726773033.90192: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.90195: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9105 1726773033.90198: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9105 1726773033.90201: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9105 1726773033.90229: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9105 1726773033.90234: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9105 
1726773033.90238: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.90627: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9105 1726773033.90634: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9105 1726773033.90638: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9105 1726773033.90641: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9105 1726773033.90644: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9105 1726773033.90648: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.90651: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9105 1726773033.90654: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9105 1726773033.90657: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9105 1726773033.90680: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9105 1726773033.90686: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9105 1726773033.90690: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.90736: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9105 1726773033.90741: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9105 1726773033.90745: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9105 1726773033.90748: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9105 1726773033.90752: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9105 1726773033.90755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.90758: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9105 1726773033.90761: Loading FilterModule 'urls' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9105 1726773033.90764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9105 1726773033.90787: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9105 1726773033.90792: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9105 1726773033.90795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.91081: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9105 1726773033.91092: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9105 1726773033.91095: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9105 1726773033.91099: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9105 1726773033.91102: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9105 1726773033.91107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.91110: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9105 1726773033.91113: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9105 1726773033.91117: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9105 1726773033.91140: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9105 1726773033.91144: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9105 1726773033.91147: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.91514: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9105 1726773033.91520: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9105 1726773033.91524: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9105 1726773033.91528: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9105 
1726773033.91531: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9105 1726773033.91534: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.91537: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9105 1726773033.91540: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9105 1726773033.91543: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9105 1726773033.91565: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9105 1726773033.91569: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9105 1726773033.91572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.91615: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9105 1726773033.91620: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9105 1726773033.91624: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9105 1726773033.91627: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9105 1726773033.91630: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9105 1726773033.91633: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.91636: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9105 1726773033.91639: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9105 1726773033.91642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9105 1726773033.91662: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9105 1726773033.91667: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9105 1726773033.91670: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9105 1726773033.93368: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9105 1726773033.93477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 9105 1726773033.93536: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_stat.py 9105 1726773033.94635: Sending initial data 9105 1726773033.94648: Sent initial data (151 bytes) 9105 1726773033.97655: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpr2sk0_ov /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_stat.py <<< 9105 1726773033.99323: stderr chunk (state=3): >>><<< 9105 1726773033.99333: stdout chunk (state=3): >>><<< 9105 1726773033.99364: done transferring module to remote 9105 1726773033.99388: _low_level_execute_command(): starting 9105 1726773033.99398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/ /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_stat.py && sleep 0' 9105 1726773034.02572: stderr chunk (state=2): >>><<< 9105 1726773034.02590: stdout chunk (state=2): >>><<< 9105 1726773034.02621: _low_level_execute_command() done: rc=0, stdout=, stderr= 9105 1726773034.02628: _low_level_execute_command(): starting 9105 1726773034.02637: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_stat.py && sleep 0' 9105 1726773034.18466: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 381, "inode": 257949890, "dev": 51713, "nlink": 1, "atime": 1726773033.628327, "mtime": 1726773023.380346, "ctime": 1726773023.7793453, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "mimetype": "text/plain", "charset": "us-ascii", "version": "260595774", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 9105 1726773034.19557: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 9105 1726773034.19566: stdout chunk (state=3): >>><<< 9105 1726773034.19577: stderr chunk (state=3): >>><<< 9105 1726773034.19599: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 381, "inode": 257949890, "dev": 51713, "nlink": 1, "atime": 1726773033.628327, "mtime": 1726773023.380346, "ctime": 1726773023.7793453, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "mimetype": "text/plain", "charset": "us-ascii", "version": "260595774", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 9105 1726773034.19725: done with _execute_module (stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9105 1726773034.20486: Sending initial data 9105 1726773034.20501: Sent initial data (159 bytes) 9105 1726773034.22920: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpsxlso0q5/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source <<< 9105 1726773034.23463: stderr chunk (state=3): >>><<< 9105 1726773034.23470: stdout chunk (state=3): >>><<< 9105 1726773034.23502: _low_level_execute_command(): starting 9105 1726773034.23515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/ /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source && sleep 0' 9105 1726773034.26534: stderr chunk (state=2): >>><<< 9105 1726773034.26546: stdout chunk (state=2): >>><<< 9105 1726773034.26564: _low_level_execute_command() done: rc=0, stdout=, stderr= 9105 1726773034.26674: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 9105 1726773034.26731: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_copy.py 9105 1726773034.27023: Sending initial data 9105 1726773034.27039: Sent initial data (151 bytes) 9105 1726773034.29464: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpdrfexgv5 
/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_copy.py <<< 9105 1726773034.30481: stderr chunk (state=3): >>><<< 9105 1726773034.30488: stdout chunk (state=3): >>><<< 9105 1726773034.30510: done transferring module to remote 9105 1726773034.30522: _low_level_execute_command(): starting 9105 1726773034.30526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/ /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_copy.py && sleep 0' 9105 1726773034.33440: stderr chunk (state=2): >>><<< 9105 1726773034.33458: stdout chunk (state=2): >>><<< 9105 1726773034.33479: _low_level_execute_command() done: rc=0, stdout=, stderr= 9105 1726773034.33482: _low_level_execute_command(): starting 9105 1726773034.33493: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/AnsiballZ_copy.py && sleep 0' 9105 1726773034.49251: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 9105 1726773034.51069: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9105 1726773034.51084: stdout chunk (state=3): >>><<< 9105 1726773034.51098: stderr chunk (state=3): >>><<< 9105 1726773034.51120: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
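With the rendered profile copied into place, the play moves on to the task announced below, "Restart tuned to apply active profile, mode changes", dispatched through the service action plugin. Its module arguments fall outside this excerpt, so the following is only a plausible sketch under the assumption that the role restarts a service named tuned:

    - name: Restart tuned to apply active profile, mode changes
      service:
        name: tuned        # assumption: the real role may take the service name from a variable
        state: restarted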
9105 1726773034.51169: done with _execute_module (copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '3feaf86b2638623e3300792e683ce55f91f31e9a', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9105 1726773034.51215: _low_level_execute_command(): starting 9105 1726773034.51224: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/ > /dev/null 2>&1 && sleep 0' 9105 1726773034.54273: stderr chunk (state=2): >>><<< 9105 1726773034.54292: stdout chunk (state=2): >>><<< 9105 1726773034.54318: _low_level_execute_command() done: rc=0, stdout=, stderr= 9105 1726773034.54346: handler run complete 9105 1726773034.54391: attempt loop complete, returning result 9105 1726773034.54401: _execute() done 9105 1726773034.54404: dumping result to json 9105 1726773034.54410: done dumping result, returning 9105 1726773034.54426: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [12a3200b-1e9d-1dbd-cc52-0000000000b8] 9105 1726773034.54442: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b8 9105 1726773034.54800: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b8 9105 1726773034.54805: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "src": "/root/.ansible/tmp/ansible-tmp-1726773033.8231146-9105-270043369574138/source", "state": "file", "uid": 0 } 8119 1726773034.55340: no more pending results, returning what we have 8119 1726773034.55346: results queue empty 8119 1726773034.55348: checking for any_errors_fatal 8119 1726773034.55353: done checking for any_errors_fatal 8119 1726773034.55356: checking for max_fail_percentage 8119 1726773034.55359: done checking for max_fail_percentage 8119 1726773034.55361: checking to see if all hosts have failed and the running result is not ok 8119 1726773034.55363: done checking to see if all hosts have failed 8119 1726773034.55365: getting the remaining hosts for this loop 8119 1726773034.55368: done getting the remaining hosts for this loop 8119 1726773034.55376: building list of next tasks for hosts 8119 1726773034.55380: getting the next task for host managed_node2 8119 1726773034.55389: done getting next task for host managed_node2 8119 1726773034.55394: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773034.55399: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, 
fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773034.55402: done building task lists 8119 1726773034.55404: counting tasks in each state of execution 8119 1726773034.55408: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773034.55410: advancing hosts in ITERATING_TASKS 8119 1726773034.55413: starting to advance hosts 8119 1726773034.55415: getting the next task for host managed_node2 8119 1726773034.55419: done getting next task for host managed_node2 8119 1726773034.55422: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773034.55426: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773034.55428: done advancing hosts to next task 8119 1726773034.55444: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773034.55449: getting variables 8119 1726773034.55453: in VariableManager get_vars() 8119 1726773034.55490: Calling all_inventory to load vars for managed_node2 8119 1726773034.55497: Calling groups_inventory to load vars for managed_node2 8119 1726773034.55500: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773034.55530: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773034.55545: Calling all_plugins_play to load vars for managed_node2 8119 1726773034.55561: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773034.55574: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773034.55593: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773034.55604: Calling groups_plugins_play to load vars for managed_node2 8119 1726773034.55620: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773034.55650: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773034.55672: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773034.56010: done with get_vars() 8119 1726773034.56025: done getting variables 8119 1726773034.56032: sending task start callback, copying the task so we can template it temporarily 8119 1726773034.56034: done copying, going to template now 8119 1726773034.56037: done templating 8119 1726773034.56040: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:10:34 -0400 (0:00:00.844) 0:00:29.117 **** 8119 1726773034.56062: sending task start callback 8119 1726773034.56066: entering _queue_task() for managed_node2/service 8119 1726773034.56212: worker is 1 (out of 1 available) 8119 1726773034.56248: exiting _queue_task() for managed_node2/service 8119 1726773034.56344: done queuing things up, now waiting for results queue to drain 8119 1726773034.56349: waiting for pending results... 9167 1726773034.56580: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 9167 1726773034.56649: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000b9 9167 1726773034.59252: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9167 1726773034.59375: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9167 1726773034.59450: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9167 1726773034.59492: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9167 1726773034.59535: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9167 1726773034.59577: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9167 1726773034.59638: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9167 1726773034.59669: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9167 1726773034.59693: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9167 1726773034.59804: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9167 1726773034.59829: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9167 1726773034.59852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9167 1726773034.60053: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9167 1726773034.60058: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9167 1726773034.60062: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9167 1726773034.60067: Loading FilterModule 'json_query' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9167 1726773034.60070: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9167 1726773034.60073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9167 1726773034.60076: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9167 1726773034.60079: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9167 1726773034.60082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9167 1726773034.63202: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9167 1726773034.63209: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9167 1726773034.63212: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9167 1726773034.63331: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9167 1726773034.63336: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9167 1726773034.63338: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9167 1726773034.63340: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9167 1726773034.63342: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9167 1726773034.63344: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9167 1726773034.63346: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9167 1726773034.63348: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9167 1726773034.63349: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9167 1726773034.63374: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9167 1726773034.63379: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9167 1726773034.63381: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9167 1726773034.63719: trying 
/usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9167 1726773034.63777: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9167 1726773034.63791: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9167 1726773034.63807: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9167 1726773034.63814: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9167 1726773034.63921: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9167 1726773034.63930: starting attempt loop 9167 1726773034.63933: running the handler 9167 1726773034.64090: _low_level_execute_command(): starting 9167 1726773034.64097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9167 1726773034.66979: stdout chunk (state=2): >>>/root <<< 9167 1726773034.67115: stderr chunk (state=3): >>><<< 9167 1726773034.67123: stdout chunk (state=3): >>><<< 9167 1726773034.67150: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9167 1726773034.67169: _low_level_execute_command(): starting 9167 1726773034.67177: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015 `" && echo ansible-tmp-1726773034.6716135-9167-112572663924015="` echo /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015 `" ) && sleep 0' 9167 1726773034.71306: stdout chunk (state=2): >>>ansible-tmp-1726773034.6716135-9167-112572663924015=/root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015 <<< 9167 1726773034.71348: stderr chunk (state=3): >>><<< 9167 1726773034.71355: stdout chunk (state=3): >>><<< 9167 1726773034.71378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773034.6716135-9167-112572663924015=/root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015 , stderr= 9167 1726773034.71522: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/systemd-ZIP_DEFLATED 9167 1726773034.71634: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015/AnsiballZ_systemd.py 9167 1726773034.73015: Sending initial data 9167 1726773034.73030: Sent initial data (154 bytes) 9167 1726773034.76561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpg_ov_ut8 /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015/AnsiballZ_systemd.py <<< 9167 1726773034.79008: stderr chunk (state=3): >>><<< 9167 1726773034.79019: stdout chunk (state=3): >>><<< 9167 1726773034.79050: done transferring module to remote 9167 1726773034.79067: _low_level_execute_command(): starting 9167 1726773034.79074: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015/ /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015/AnsiballZ_systemd.py && sleep 0' 9167 1726773034.82276: stderr chunk (state=2): >>><<< 9167 
1726773034.82296: stdout chunk (state=2): >>><<< 9167 1726773034.82327: _low_level_execute_command() done: rc=0, stdout=, stderr= 9167 1726773034.82334: _low_level_execute_command(): starting 9167 1726773034.82344: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015/AnsiballZ_systemd.py && sleep 0' 9167 1726773035.33776: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:24 EDT", "WatchdogTimestampMonotonic": "423997731", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8091", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ExecMainStartTimestampMonotonic": "423749407", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8091", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:24 EDT] ; stop_time=[n/a] ; pid=8091 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15007744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "Before": "shutdown.target multi-user.target", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:24 EDT", "StateChangeTimestampMonotonic": "423997735", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveExitTimestampMonotonic": "423749579", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveEnterTimestampMonotonic": "423997735", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveExitTimestampMonotonic": "423641289", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveEnterTimestampMonotonic": "423746351", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", 
"CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ConditionTimestampMonotonic": "423748386", "AssertTimestamp": "Thu 2024-09-19 15:10:24 EDT", "AssertTimestampMonotonic": "423748387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6ea8da0cb91e4e59bcedd6fa8fbcc8cd", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 9167 1726773035.35455: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9167 1726773035.35466: stdout chunk (state=3): >>><<< 9167 1726773035.35480: stderr chunk (state=3): >>><<< 9167 1726773035.35504: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:24 EDT", "WatchdogTimestampMonotonic": "423997731", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8091", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ExecMainStartTimestampMonotonic": "423749407", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8091", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:24 EDT] ; stop_time=[n/a] ; pid=8091 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15007744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", 
"TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "Before": "shutdown.target multi-user.target", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket 
polkit.service dbus.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:24 EDT", "StateChangeTimestampMonotonic": "423997735", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveExitTimestampMonotonic": "423749579", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveEnterTimestampMonotonic": "423997735", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveExitTimestampMonotonic": "423641289", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveEnterTimestampMonotonic": "423746351", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ConditionTimestampMonotonic": "423748386", "AssertTimestamp": "Thu 2024-09-19 15:10:24 EDT", "AssertTimestampMonotonic": "423748387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6ea8da0cb91e4e59bcedd6fa8fbcc8cd", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
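
For orientation, the systemd module invocation captured above carries module_args of name=tuned, state=restarted, enabled=true, and the per-item callback output further below reports item=tuned with ansible_loop_var=item. That is consistent with a service task at roles/kernel_settings/tasks/main.yml:149 shaped roughly like the following sketch, reconstructed from the log rather than taken from the role's source; the literal one-element service list stands in for whatever variable the role actually loops over:

    - name: Restart tuned to apply active profile, mode changes
      service:
        name: "{{ item }}"     # item=tuned in this run
        state: restarted
        enabled: true
      loop:
        - tuned                # placeholder list; the real role presumably loops over a variable

Using the generic service action lets Ansible select the systemd backend on this host, which matches the AnsiballZ_systemd.py payload transferred and executed in the entries above.
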
9167 1726773035.35656: done with _execute_module (systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9167 1726773035.35678: _low_level_execute_command(): starting 9167 1726773035.35689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773034.6716135-9167-112572663924015/ > /dev/null 2>&1 && sleep 0' 9167 1726773035.38863: stderr chunk (state=2): >>><<< 9167 1726773035.38880: stdout chunk (state=2): >>><<< 9167 1726773035.38910: _low_level_execute_command() done: rc=0, stdout=, stderr= 9167 1726773035.38924: handler run complete 9167 1726773035.38931: attempt loop complete, returning result 9167 1726773035.39019: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9167 1726773035.39028: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9167 1726773035.39032: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9167 1726773035.39036: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9167 1726773035.39040: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9167 1726773035.39044: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9167 1726773035.39047: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9167 1726773035.39051: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9167 1726773035.39055: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9167 1726773035.39108: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9167 1726773035.39114: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9167 1726773035.39119: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9167 1726773035.39362: dumping result to json 9167 1726773035.39392: done dumping result, returning 9167 1726773035.39410: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply 
active profile, mode changes [12a3200b-1e9d-1dbd-cc52-0000000000b9] 9167 1726773035.39421: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b9 9167 1726773035.39425: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000b9 9167 1726773035.39427: WORKER PROCESS EXITING changed: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveEnterTimestampMonotonic": "423997735", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ActiveExitTimestampMonotonic": "423641289", "ActiveState": "active", "After": "sysinit.target basic.target systemd-sysctl.service dbus.socket network.target system.slice systemd-journald.socket polkit.service dbus.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:24 EDT", "AssertTimestampMonotonic": "423748387", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ConditionTimestampMonotonic": "423748386", "ConfigurationDirectoryMode": "0755", "Conflicts": "tlp.service auto-cpufreq.service shutdown.target power-profiles-daemon.service cpupower.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8091", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:24 EDT", "ExecMainStartTimestampMonotonic": "423749407", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:24 EDT] ; stop_time=[n/a] ; pid=8091 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": 
"0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveEnterTimestampMonotonic": "423746351", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:24 EDT", "InactiveExitTimestampMonotonic": "423749579", "InvocationID": "6ea8da0cb91e4e59bcedd6fa8fbcc8cd", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8091", "MemoryAccounting": "yes", "MemoryCurrent": "15007744", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket system.slice dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:24 EDT", 
"StateChangeTimestampMonotonic": "423997735", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:24 EDT", "WatchdogTimestampMonotonic": "423997731", "WatchdogUSec": "0" } } 8119 1726773035.40578: no more pending results, returning what we have 8119 1726773035.40588: results queue empty 8119 1726773035.40591: checking for any_errors_fatal 8119 1726773035.40599: done checking for any_errors_fatal 8119 1726773035.40601: checking for max_fail_percentage 8119 1726773035.40604: done checking for max_fail_percentage 8119 1726773035.40607: checking to see if all hosts have failed and the running result is not ok 8119 1726773035.40609: done checking to see if all hosts have failed 8119 1726773035.40611: getting the remaining hosts for this loop 8119 1726773035.40614: done getting the remaining hosts for this loop 8119 1726773035.40621: building list of next tasks for hosts 8119 1726773035.40624: getting the next task for host managed_node2 8119 1726773035.40632: done getting next task for host managed_node2 8119 1726773035.40636: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773035.40640: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773035.40642: done building task lists 8119 1726773035.40644: counting tasks in each state of execution 8119 1726773035.40648: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773035.40650: advancing hosts in ITERATING_TASKS 8119 1726773035.40652: starting to advance hosts 8119 1726773035.40654: getting the next task for host managed_node2 8119 1726773035.40658: done getting next task for host managed_node2 8119 1726773035.40661: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773035.40663: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773035.40665: done advancing hosts to next task 8119 1726773035.40726: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773035.40733: getting variables 8119 1726773035.40737: in VariableManager get_vars() 8119 1726773035.40773: Calling all_inventory to load vars for managed_node2 8119 1726773035.40779: Calling groups_inventory to load vars for managed_node2 8119 1726773035.40785: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773035.40816: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.40833: Calling all_plugins_play to load vars for managed_node2 8119 1726773035.40851: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.40866: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773035.40887: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.40900: Calling groups_plugins_play to load vars for managed_node2 8119 1726773035.40917: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.40949: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.40973: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.41325: done with get_vars() 8119 1726773035.41338: done getting variables 8119 1726773035.41345: sending task start callback, copying the task so we can template it temporarily 8119 1726773035.41348: done copying, going to template now 8119 1726773035.41351: done templating 8119 1726773035.41353: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.853) 0:00:29.970 **** 8119 1726773035.41376: sending task start callback 8119 1726773035.41379: entering _queue_task() for managed_node2/command 8119 1726773035.41382: Creating lock for command 8119 1726773035.41565: worker is 1 (out of 1 available) 8119 1726773035.41599: exiting _queue_task() for managed_node2/command 8119 1726773035.41661: done queuing things up, now waiting for results queue to drain 8119 1726773035.41666: waiting for pending results... 
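
The "Tuned apply settings" task queued here goes through the command action plugin and, as the next entries show, is skipped because its when: condition evaluates to False. A plausible shape for such a task is sketched below; the tuned-adm call, the profile name (taken from the /etc/tuned/kernel_settings directory written earlier in the run), and the guard variable are assumptions rather than the role's actual source:

    - name: Tuned apply settings
      command: tuned-adm profile kernel_settings          # assumed command; profile name inferred from /etc/tuned/kernel_settings
      when: __kernel_settings_apply_via_tuned_adm | d(false)   # hypothetical flag; in this run the condition was False, so the service restart above applied the profile instead
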
9228 1726773035.41904: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 9228 1726773035.41972: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000ba 9228 1726773035.42027: calling self._execute() 9228 1726773035.44458: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9228 1726773035.44573: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9228 1726773035.44645: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9228 1726773035.44685: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9228 1726773035.44728: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9228 1726773035.44768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9228 1726773035.44827: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9228 1726773035.44859: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9228 1726773035.44886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9228 1726773035.45017: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9228 1726773035.45056: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9228 1726773035.45080: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9228 1726773035.45429: when evaluation is False, skipping this task 9228 1726773035.45435: _execute() done 9228 1726773035.45438: dumping result to json 9228 1726773035.45440: done dumping result, returning 9228 1726773035.45447: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [12a3200b-1e9d-1dbd-cc52-0000000000ba] 9228 1726773035.45459: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000ba 9228 1726773035.45789: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000ba 9228 1726773035.45794: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773035.46205: no more pending results, returning what we have 8119 1726773035.46214: results queue empty 8119 1726773035.46217: checking for any_errors_fatal 8119 1726773035.46228: done checking for any_errors_fatal 8119 1726773035.46230: checking for max_fail_percentage 8119 1726773035.46234: done checking for max_fail_percentage 8119 1726773035.46236: checking to see if all hosts have failed and the running result is not ok 8119 1726773035.46238: done checking to see if all hosts have failed 8119 1726773035.46240: getting the remaining hosts for this loop 8119 1726773035.46243: done getting the remaining hosts for this loop 8119 1726773035.46251: building list of next tasks for hosts 8119 1726773035.46254: getting the next task for host managed_node2 8119 1726773035.46262: done getting next task for host managed_node2 8119 1726773035.46268: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773035.46273: ^ state is: HOST STATE: block=2, task=11, 
rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773035.46276: done building task lists 8119 1726773035.46278: counting tasks in each state of execution 8119 1726773035.46284: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773035.46287: advancing hosts in ITERATING_TASKS 8119 1726773035.46290: starting to advance hosts 8119 1726773035.46292: getting the next task for host managed_node2 8119 1726773035.46297: done getting next task for host managed_node2 8119 1726773035.46301: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773035.46304: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773035.46307: done advancing hosts to next task 8119 1726773035.46326: getting variables 8119 1726773035.46331: in VariableManager get_vars() 8119 1726773035.46368: Calling all_inventory to load vars for managed_node2 8119 1726773035.46374: Calling groups_inventory to load vars for managed_node2 8119 1726773035.46378: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773035.46413: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.46430: Calling all_plugins_play to load vars for managed_node2 8119 1726773035.46449: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.46464: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773035.46485: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.46497: Calling groups_plugins_play to load vars for managed_node2 8119 1726773035.46519: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.46552: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.46577: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.46934: done with get_vars() 8119 1726773035.46948: done getting variables 8119 1726773035.46955: sending task start callback, copying the task so we can template it temporarily 8119 1726773035.46958: done copying, going to 
template now 8119 1726773035.46961: done templating 8119 1726773035.46963: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.056) 0:00:30.026 **** 8119 1726773035.46989: sending task start callback 8119 1726773035.46993: entering _queue_task() for managed_node2/include_tasks 8119 1726773035.47146: worker is 1 (out of 1 available) 8119 1726773035.47196: exiting _queue_task() for managed_node2/include_tasks 8119 1726773035.47271: done queuing things up, now waiting for results queue to drain 8119 1726773035.47276: waiting for pending results... 9231 1726773035.47497: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 9231 1726773035.47572: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000bb 9231 1726773035.47632: calling self._execute() 9231 1726773035.49779: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9231 1726773035.49899: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9231 1726773035.49975: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9231 1726773035.50019: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9231 1726773035.50061: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9231 1726773035.50103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9231 1726773035.50165: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9231 1726773035.50198: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9231 1726773035.50227: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9231 1726773035.50342: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9231 1726773035.50366: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9231 1726773035.50408: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9231 1726773035.50765: _execute() done 9231 1726773035.50770: dumping result to json 9231 1726773035.50773: done dumping result, returning 9231 1726773035.50780: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [12a3200b-1e9d-1dbd-cc52-0000000000bb] 9231 1726773035.50795: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000bb 9231 1726773035.50830: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000bb 9231 1726773035.50835: WORKER PROCESS EXITING 8119 1726773035.51216: no more pending results, returning what we have 8119 1726773035.51227: in VariableManager get_vars() 8119 1726773035.51275: Calling all_inventory to load vars for managed_node2 8119 1726773035.51282: Calling groups_inventory to load vars for managed_node2 8119 1726773035.51289: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773035.51327: Loading VarsModule 
'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.51345: Calling all_plugins_play to load vars for managed_node2 8119 1726773035.51365: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.51381: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773035.51402: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.51416: Calling groups_plugins_play to load vars for managed_node2 8119 1726773035.51435: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.51467: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.51494: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.51898: done with get_vars() 8119 1726773035.51961: we have included files to process 8119 1726773035.51966: generating all_blocks data 8119 1726773035.51970: done generating all_blocks data 8119 1726773035.51975: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773035.51978: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773035.51985: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8119 1726773035.52402: done processing included file 8119 1726773035.52405: iterating over new_blocks loaded from include file 8119 1726773035.52408: in VariableManager get_vars() 8119 1726773035.52439: done with get_vars() 8119 1726773035.52443: filtering new block on tags 8119 1726773035.52535: done filtering new block on tags 8119 1726773035.52550: done iterating over new_blocks loaded from include file 8119 1726773035.52553: extending task lists for all hosts with included blocks 8119 1726773035.53148: done extending task lists 8119 1726773035.53153: done processing included files 8119 1726773035.53156: results queue empty 8119 1726773035.53158: checking for any_errors_fatal 8119 1726773035.53161: done checking for any_errors_fatal 8119 1726773035.53164: checking for max_fail_percentage 8119 1726773035.53166: done checking for max_fail_percentage 8119 1726773035.53168: checking to see if all hosts have failed and the running result is not ok 8119 1726773035.53170: done checking to see if all hosts have failed 8119 1726773035.53172: getting the remaining hosts for this loop 8119 1726773035.53175: done getting the remaining hosts for this loop 8119 1726773035.53181: building list of next tasks for hosts 8119 1726773035.53187: getting the next task for host managed_node2 8119 1726773035.53193: done getting next task for host managed_node2 8119 1726773035.53197: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773035.53202: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773035.53205: done building task lists 8119 1726773035.53207: counting tasks in each state of execution 8119 1726773035.53213: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773035.53216: advancing hosts in ITERATING_TASKS 8119 1726773035.53218: starting to advance hosts 8119 1726773035.53220: getting the next task for host managed_node2 8119 1726773035.53225: done getting next task for host managed_node2 8119 1726773035.53229: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773035.53233: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773035.53235: done advancing hosts to next task 8119 1726773035.53243: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773035.53246: getting variables 8119 1726773035.53249: in VariableManager get_vars() 8119 1726773035.53268: Calling all_inventory to load vars for managed_node2 8119 1726773035.53272: Calling groups_inventory to load vars for managed_node2 8119 1726773035.53276: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773035.53313: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.53326: Calling all_plugins_play to load vars for managed_node2 8119 1726773035.53344: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.53358: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773035.53377: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.53392: Calling groups_plugins_play to load vars for managed_node2 8119 1726773035.53415: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.53446: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.53470: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773035.53789: done with get_vars() 8119 1726773035.53802: done getting variables 8119 1726773035.53812: sending task start callback, copying the task so we can template it temporarily 8119 1726773035.53815: done copying, going to template now 8119 1726773035.53818: done templating 8119 1726773035.53821: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.068) 0:00:30.094 **** 8119 1726773035.53843: sending task start callback 8119 1726773035.53846: entering _queue_task() for managed_node2/command 8119 1726773035.54016: worker is 1 (out of 1 available) 8119 1726773035.54050: exiting _queue_task() for managed_node2/command 8119 1726773035.54123: done queuing things up, now waiting for results queue to drain 8119 1726773035.54128: waiting for pending results... 
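[annotation] The "Verify settings" banner above comes from an include_tasks task: the worker returns an "include file" result instead of executing anything, and the main process (pid 8119) then loads verify_settings.yml and splices its block into the task list for managed_node2, which is what the "we have included files to process ... extending task lists" messages record. A minimal sketch of such an include, assuming nothing beyond the task name and file path shown in the log:

    # roles/kernel_settings/tasks/main.yml:166 (sketch)
    - name: Verify settings
      include_tasks: verify_settings.yml
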
9234 1726773035.54397: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 9234 1726773035.54476: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000001c3 9234 1726773035.54536: calling self._execute() 9234 1726773035.54767: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9234 1726773035.54829: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9234 1726773035.54845: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9234 1726773035.54862: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9234 1726773035.54871: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9234 1726773035.55037: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9234 1726773035.55061: starting attempt loop 9234 1726773035.55065: running the handler 9234 1726773035.55078: _low_level_execute_command(): starting 9234 1726773035.55087: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9234 1726773035.58000: stdout chunk (state=2): >>>/root <<< 9234 1726773035.58145: stderr chunk (state=3): >>><<< 9234 1726773035.58152: stdout chunk (state=3): >>><<< 9234 1726773035.58184: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9234 1726773035.58205: _low_level_execute_command(): starting 9234 1726773035.58216: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164 `" && echo ansible-tmp-1726773035.5819583-9234-266027347690164="` echo /root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164 `" ) && sleep 0' 9234 1726773035.61558: stdout chunk (state=2): >>>ansible-tmp-1726773035.5819583-9234-266027347690164=/root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164 <<< 9234 1726773035.61577: stderr chunk (state=2): >>><<< 9234 1726773035.61591: stdout chunk (state=3): >>><<< 9234 1726773035.61617: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773035.5819583-9234-266027347690164=/root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164 , stderr= 9234 1726773035.61781: ANSIBALLZ: Using lock for command 9234 1726773035.61788: ANSIBALLZ: Acquiring lock 9234 1726773035.61794: ANSIBALLZ: Lock acquired: 140408669159728 9234 1726773035.61798: ANSIBALLZ: Creating module 9234 1726773035.77125: ANSIBALLZ: Writing module into payload 9234 1726773035.77426: ANSIBALLZ: Writing module 9234 1726773035.77447: ANSIBALLZ: Renaming module 9234 1726773035.77453: ANSIBALLZ: Done creating module 9234 1726773035.77487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164/AnsiballZ_command.py 9234 1726773035.78290: Sending initial data 9234 1726773035.78304: Sent initial data (154 bytes) 9234 1726773035.80904: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpqv_przyu 
/root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164/AnsiballZ_command.py <<< 9234 1726773035.83135: stderr chunk (state=3): >>><<< 9234 1726773035.83142: stdout chunk (state=3): >>><<< 9234 1726773035.83171: done transferring module to remote 9234 1726773035.83191: _low_level_execute_command(): starting 9234 1726773035.83200: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164/ /root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164/AnsiballZ_command.py && sleep 0' 9234 1726773035.86180: stderr chunk (state=2): >>><<< 9234 1726773035.86196: stdout chunk (state=2): >>><<< 9234 1726773035.86221: _low_level_execute_command() done: rc=0, stdout=, stderr= 9234 1726773035.86227: _low_level_execute_command(): starting 9234 1726773035.86235: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164/AnsiballZ_command.py && sleep 0' 9234 1726773036.14417: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:36.016455", "end": "2024-09-19 15:10:36.141818", "delta": "0:00:00.125363", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9234 1726773036.15671: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9234 1726773036.15681: stdout chunk (state=3): >>><<< 9234 1726773036.15697: stderr chunk (state=3): >>><<< 9234 1726773036.15719: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:36.016455", "end": "2024-09-19 15:10:36.141818", "delta": "0:00:00.125363", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
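[annotation] Note the raw module JSON above reports "changed": true (the command module default), while the callback result printed next shows "changed": false; that is the usual signature of changed_when: false on a read-only verification command. A sketch of the task under that assumption; the register name and ignore_errors are hypothetical:

    # roles/kernel_settings/tasks/verify_settings.yml:2 (sketch)
    - name: Check that settings are applied correctly
      command: tuned-adm verify -i
      register: __kernel_settings_register_verify   # hypothetical name
      changed_when: false                           # assumption, inferred from the reported "changed": false
      ignore_errors: true                           # assumption: failures are analysed by the follow-up tasks
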
9234 1726773036.15764: done with _execute_module (command, {'_raw_params': 'tuned-adm verify -i', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9234 1726773036.15777: _low_level_execute_command(): starting 9234 1726773036.15787: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773035.5819583-9234-266027347690164/ > /dev/null 2>&1 && sleep 0' 9234 1726773036.18799: stderr chunk (state=2): >>><<< 9234 1726773036.18816: stdout chunk (state=2): >>><<< 9234 1726773036.18842: _low_level_execute_command() done: rc=0, stdout=, stderr= 9234 1726773036.18852: handler run complete 9234 1726773036.18901: attempt loop complete, returning result 9234 1726773036.18918: _execute() done 9234 1726773036.18922: dumping result to json 9234 1726773036.18928: done dumping result, returning 9234 1726773036.18943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [12a3200b-1e9d-1dbd-cc52-0000000001c3] 9234 1726773036.18961: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000001c3 9234 1726773036.19101: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000001c3 9234 1726773036.19108: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.125363", "end": "2024-09-19 15:10:36.141818", "rc": 0, "start": "2024-09-19 15:10:36.016455" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8119 1726773036.19663: no more pending results, returning what we have 8119 1726773036.19670: results queue empty 8119 1726773036.19672: checking for any_errors_fatal 8119 1726773036.19676: done checking for any_errors_fatal 8119 1726773036.19678: checking for max_fail_percentage 8119 1726773036.19681: done checking for max_fail_percentage 8119 1726773036.19685: checking to see if all hosts have failed and the running result is not ok 8119 1726773036.19688: done checking to see if all hosts have failed 8119 1726773036.19690: getting the remaining hosts for this loop 8119 1726773036.19693: done getting the remaining hosts for this loop 8119 1726773036.19700: building list of next tasks for hosts 8119 1726773036.19703: getting the next task for host managed_node2 8119 1726773036.19711: done getting next task for host managed_node2 8119 1726773036.19716: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773036.19721: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.19724: done building task lists 8119 1726773036.19726: counting tasks in each state of execution 8119 1726773036.19731: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773036.19733: advancing hosts in ITERATING_TASKS 8119 1726773036.19735: starting to advance hosts 8119 1726773036.19738: getting the next task for host managed_node2 8119 1726773036.19742: done getting next task for host managed_node2 8119 1726773036.19746: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773036.19749: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773036.19752: done advancing hosts to next task 8119 1726773036.19810: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773036.19818: getting variables 8119 1726773036.19821: in VariableManager get_vars() 8119 1726773036.19855: Calling all_inventory to load vars for managed_node2 8119 1726773036.19862: Calling groups_inventory to load vars for managed_node2 8119 1726773036.19866: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773036.19898: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.19915: Calling all_plugins_play to load vars for managed_node2 8119 1726773036.19933: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.19948: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773036.19966: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.19976: Calling groups_plugins_play to load vars for managed_node2 8119 1726773036.19996: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.20028: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.20052: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.20381: done with get_vars() 8119 1726773036.20396: done getting variables 8119 1726773036.20404: sending task start callback, copying the task so we can template it temporarily 8119 1726773036.20407: done copying, going to template now 8119 1726773036.20410: done templating 8119 1726773036.20412: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.665) 0:00:30.760 **** 8119 1726773036.20436: sending task start callback 8119 1726773036.20439: entering _queue_task() for managed_node2/shell 8119 1726773036.20442: Creating lock for shell 8119 1726773036.20647: worker is 1 (out of 1 available) 8119 1726773036.20685: exiting _queue_task() for managed_node2/shell 8119 1726773036.20753: done queuing things up, now waiting for results queue to drain 8119 1726773036.20758: waiting for pending results... 
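[annotation] The shell task announced above is skipped a few lines later with "Conditional result was False": since tuned-adm verify succeeded, there is nothing to extract from the TuneD log. A sketch of what a when-guarded follow-up like this could look like; the command and the condition are illustrative only, and only the log path comes from the tuned-adm output above:

    # roles/kernel_settings/tasks/verify_settings.yml:12 (sketch)
    - name: Get last verify results from log
      shell: grep -i error /var/log/tuned/tuned.log | tail -n 20   # illustrative command
      register: __kernel_settings_verify_log                       # hypothetical name
      changed_when: false
      when: __kernel_settings_register_verify is failed            # assumption: only runs after a failed verify
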
9288 1726773036.21019: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 9288 1726773036.21100: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000001c4 9288 1726773036.21155: calling self._execute() 9288 1726773036.26667: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9288 1726773036.26779: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9288 1726773036.26844: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9288 1726773036.26887: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9288 1726773036.26929: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9288 1726773036.26965: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9288 1726773036.27028: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9288 1726773036.27057: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9288 1726773036.27081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9288 1726773036.27185: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9288 1726773036.27210: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9288 1726773036.27231: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9288 1726773036.27551: when evaluation is False, skipping this task 9288 1726773036.27557: _execute() done 9288 1726773036.27559: dumping result to json 9288 1726773036.27562: done dumping result, returning 9288 1726773036.27568: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [12a3200b-1e9d-1dbd-cc52-0000000001c4] 9288 1726773036.27577: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000001c4 9288 1726773036.27759: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000001c4 9288 1726773036.27764: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773036.28206: no more pending results, returning what we have 8119 1726773036.28214: results queue empty 8119 1726773036.28217: checking for any_errors_fatal 8119 1726773036.28224: done checking for any_errors_fatal 8119 1726773036.28226: checking for max_fail_percentage 8119 1726773036.28230: done checking for max_fail_percentage 8119 1726773036.28232: checking to see if all hosts have failed and the running result is not ok 8119 1726773036.28234: done checking to see if all hosts have failed 8119 1726773036.28237: getting the remaining hosts for this loop 8119 1726773036.28239: done getting the remaining hosts for this loop 8119 1726773036.28248: building list of next tasks for hosts 8119 1726773036.28251: getting the next task for host managed_node2 8119 1726773036.28260: done getting next task for host managed_node2 8119 1726773036.28265: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8119 
1726773036.28270: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.28273: done building task lists 8119 1726773036.28276: counting tasks in each state of execution 8119 1726773036.28281: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773036.28285: advancing hosts in ITERATING_TASKS 8119 1726773036.28288: starting to advance hosts 8119 1726773036.28290: getting the next task for host managed_node2 8119 1726773036.28295: done getting next task for host managed_node2 8119 1726773036.28298: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8119 1726773036.28302: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773036.28305: done advancing hosts to next task 8119 1726773036.28323: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773036.28329: getting variables 8119 1726773036.28332: in VariableManager get_vars() 8119 1726773036.28369: Calling all_inventory to load vars for managed_node2 8119 1726773036.28375: Calling groups_inventory to load vars for managed_node2 8119 1726773036.28379: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773036.28413: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.28430: Calling all_plugins_play to load vars for managed_node2 8119 1726773036.28448: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.28462: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773036.28480: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.28495: Calling groups_plugins_play to load vars for managed_node2 8119 1726773036.28516: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.28547: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.28572: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.28953: done with get_vars() 8119 1726773036.28966: done getting variables 8119 1726773036.28973: sending task start callback, copying the task so we can template it temporarily 8119 1726773036.28975: done copying, going to template now 8119 1726773036.28978: done templating 8119 1726773036.28980: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.085) 0:00:30.846 **** 8119 1726773036.29006: sending task start callback 8119 1726773036.29012: entering _queue_task() for managed_node2/fail 8119 1726773036.29169: worker is 1 (out of 1 available) 8119 1726773036.29206: exiting _queue_task() for managed_node2/fail 8119 1726773036.29276: done queuing things up, now waiting for results queue to drain 8119 1726773036.29281: waiting for pending results... 
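[annotation] Like the previous shell task, this fail task is skipped because its when condition evaluates to False on a successful verify. A sketch under the same assumptions; both conditions are hypothetical and only illustrate how bootloader-related errors might be filtered out, as the task name suggests:

    # roles/kernel_settings/tasks/verify_settings.yml:23 (sketch)
    - name: Report errors that are not bootloader errors
      fail:
        msg: tuned-adm verify failed, see /var/log/tuned/tuned.log
      when:
        - __kernel_settings_register_verify is failed                    # hypothetical
        - "'/boot' not in __kernel_settings_verify_log.stdout | d('')"   # hypothetical bootloader filter
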
9292 1726773036.29503: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 9292 1726773036.29587: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000001c5 9292 1726773036.29644: calling self._execute() 9292 1726773036.35369: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9292 1726773036.35477: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9292 1726773036.35546: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9292 1726773036.35586: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9292 1726773036.35631: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9292 1726773036.35667: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9292 1726773036.35723: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9292 1726773036.35751: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9292 1726773036.35774: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9292 1726773036.35901: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9292 1726773036.35929: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9292 1726773036.35948: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9292 1726773036.36234: when evaluation is False, skipping this task 9292 1726773036.36240: _execute() done 9292 1726773036.36242: dumping result to json 9292 1726773036.36245: done dumping result, returning 9292 1726773036.36251: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [12a3200b-1e9d-1dbd-cc52-0000000001c5] 9292 1726773036.36260: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000001c5 9292 1726773036.36298: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000001c5 9292 1726773036.36303: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773036.36749: no more pending results, returning what we have 8119 1726773036.36755: results queue empty 8119 1726773036.36757: checking for any_errors_fatal 8119 1726773036.36762: done checking for any_errors_fatal 8119 1726773036.36764: checking for max_fail_percentage 8119 1726773036.36767: done checking for max_fail_percentage 8119 1726773036.36769: checking to see if all hosts have failed and the running result is not ok 8119 1726773036.36771: done checking to see if all hosts have failed 8119 1726773036.36773: getting the remaining hosts for this loop 8119 1726773036.36776: done getting the remaining hosts for this loop 8119 1726773036.36785: building list of next tasks for hosts 8119 1726773036.36788: getting the next task for host managed_node2 8119 1726773036.36798: done getting next task for host managed_node2 8119 1726773036.36803: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to 
apply changes 8119 1726773036.36808: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.36813: done building task lists 8119 1726773036.36815: counting tasks in each state of execution 8119 1726773036.36820: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773036.36822: advancing hosts in ITERATING_TASKS 8119 1726773036.36824: starting to advance hosts 8119 1726773036.36826: getting the next task for host managed_node2 8119 1726773036.36832: done getting next task for host managed_node2 8119 1726773036.36836: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8119 1726773036.36839: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.36841: done advancing hosts to next task 8119 1726773036.36856: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773036.36861: getting variables 8119 1726773036.36865: in VariableManager get_vars() 8119 1726773036.36905: Calling all_inventory to load vars for managed_node2 8119 1726773036.36915: Calling groups_inventory to load vars for managed_node2 8119 1726773036.36919: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773036.36951: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.36967: Calling all_plugins_play to load vars for managed_node2 8119 1726773036.36986: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.37002: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773036.37025: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.37037: Calling groups_plugins_play to load vars for managed_node2 8119 1726773036.37054: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.37088: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py 
(found_in_cache=True, class_only=False) 8119 1726773036.37117: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.37360: done with get_vars() 8119 1726773036.37373: done getting variables 8119 1726773036.37379: sending task start callback, copying the task so we can template it temporarily 8119 1726773036.37381: done copying, going to template now 8119 1726773036.37385: done templating 8119 1726773036.37388: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.084) 0:00:30.930 **** 8119 1726773036.37416: sending task start callback 8119 1726773036.37419: entering _queue_task() for managed_node2/set_fact 8119 1726773036.37582: worker is 1 (out of 1 available) 8119 1726773036.37624: exiting _queue_task() for managed_node2/set_fact 8119 1726773036.37703: done queuing things up, now waiting for results queue to drain 8119 1726773036.37708: waiting for pending results... 9300 1726773036.37931: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 9300 1726773036.37999: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000bc 9300 1726773036.38057: calling self._execute() 9300 1726773036.38389: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9300 1726773036.38447: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9300 1726773036.38463: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9300 1726773036.38479: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9300 1726773036.38492: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9300 1726773036.38662: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9300 1726773036.38675: starting attempt loop 9300 1726773036.38679: running the handler 9300 1726773036.38704: handler run complete 9300 1726773036.38713: attempt loop complete, returning result 9300 1726773036.38717: _execute() done 9300 1726773036.38720: dumping result to json 9300 1726773036.38723: done dumping result, returning 9300 1726773036.38729: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000000bc] 9300 1726773036.38739: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000bc 9300 1726773036.38780: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000bc 9300 1726773036.38787: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8119 1726773036.39205: no more pending results, returning what we have 8119 1726773036.39213: results queue empty 8119 
1726773036.39216: checking for any_errors_fatal 8119 1726773036.39220: done checking for any_errors_fatal 8119 1726773036.39222: checking for max_fail_percentage 8119 1726773036.39226: done checking for max_fail_percentage 8119 1726773036.39228: checking to see if all hosts have failed and the running result is not ok 8119 1726773036.39230: done checking to see if all hosts have failed 8119 1726773036.39232: getting the remaining hosts for this loop 8119 1726773036.39235: done getting the remaining hosts for this loop 8119 1726773036.39243: building list of next tasks for hosts 8119 1726773036.39247: getting the next task for host managed_node2 8119 1726773036.39254: done getting next task for host managed_node2 8119 1726773036.39258: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773036.39263: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.39266: done building task lists 8119 1726773036.39268: counting tasks in each state of execution 8119 1726773036.39272: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773036.39275: advancing hosts in ITERATING_TASKS 8119 1726773036.39277: starting to advance hosts 8119 1726773036.39280: getting the next task for host managed_node2 8119 1726773036.39286: done getting next task for host managed_node2 8119 1726773036.39290: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773036.39294: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773036.39296: done advancing hosts to next task 8119 1726773036.39314: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773036.39319: getting variables 8119 1726773036.39323: in VariableManager get_vars() 8119 1726773036.39356: Calling all_inventory to load vars for managed_node2 8119 1726773036.39362: Calling groups_inventory to load vars for managed_node2 8119 1726773036.39366: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773036.39397: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.39417: Calling all_plugins_play to load vars for managed_node2 8119 1726773036.39435: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.39451: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773036.39470: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.39482: Calling groups_plugins_play to load vars for managed_node2 8119 1726773036.39503: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.39539: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.39564: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.39898: done with get_vars() 8119 1726773036.39915: done getting variables 8119 1726773036.39924: sending task start callback, copying the task so we can template it temporarily 8119 1726773036.39927: done copying, going to template now 8119 1726773036.39930: done templating 8119 1726773036.39932: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.025) 0:00:30.955 **** 8119 1726773036.39956: sending task start callback 8119 1726773036.39960: entering _queue_task() for managed_node2/set_fact 8119 1726773036.40125: worker is 1 (out of 1 available) 8119 1726773036.40163: exiting _queue_task() for managed_node2/set_fact 8119 1726773036.40237: done queuing things up, now waiting for results queue to drain 8119 1726773036.40243: waiting for pending results... 
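[annotation] The two set_fact tasks from main.yml:177 and main.yml:181 record the outcome of the run: the result above shows kernel_settings_reboot_required set to false, and the result that follows shows __kernel_settings_changed set to true. A sketch using the literal values observed on this run; in the role these are presumably computed from earlier change detection rather than hard-coded:

    # roles/kernel_settings/tasks/main.yml:177 (sketch, literal value from this run)
    - name: Set the flag that reboot is needed to apply changes
      set_fact:
        kernel_settings_reboot_required: false

    # roles/kernel_settings/tasks/main.yml:181 (sketch, literal value from this run)
    - name: Set flag to indicate changed for testing
      set_fact:
        __kernel_settings_changed: true
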
9302 1726773036.40466: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 9302 1726773036.40538: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000000bd 9302 1726773036.40594: calling self._execute() 9302 1726773036.46008: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9302 1726773036.46133: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9302 1726773036.46197: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9302 1726773036.46241: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9302 1726773036.46285: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9302 1726773036.46323: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9302 1726773036.46376: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9302 1726773036.46406: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9302 1726773036.46436: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9302 1726773036.46556: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9302 1726773036.46579: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9302 1726773036.46604: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9302 1726773036.46868: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9302 1726773036.46874: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 9302 1726773036.46878: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 9302 1726773036.46882: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 9302 1726773036.46888: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 9302 1726773036.46891: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9302 1726773036.46894: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 9302 1726773036.46897: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9302 1726773036.46900: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9302 1726773036.46928: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, 
class_only=False) 9302 1726773036.46933: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9302 1726773036.46937: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9302 1726773036.46985: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9302 1726773036.47037: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9302 1726773036.47050: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9302 1726773036.47065: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9302 1726773036.47072: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9302 1726773036.47177: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9302 1726773036.47188: starting attempt loop 9302 1726773036.47192: running the handler 9302 1726773036.47201: handler run complete 9302 1726773036.47206: attempt loop complete, returning result 9302 1726773036.47211: _execute() done 9302 1726773036.47214: dumping result to json 9302 1726773036.47217: done dumping result, returning 9302 1726773036.47223: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [12a3200b-1e9d-1dbd-cc52-0000000000bd] 9302 1726773036.47231: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000bd 9302 1726773036.47266: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000000bd 9302 1726773036.47270: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8119 1726773036.47597: no more pending results, returning what we have 8119 1726773036.47603: results queue empty 8119 1726773036.47606: checking for any_errors_fatal 8119 1726773036.47611: done checking for any_errors_fatal 8119 1726773036.47613: checking for max_fail_percentage 8119 1726773036.47616: done checking for max_fail_percentage 8119 1726773036.47618: checking to see if all hosts have failed and the running result is not ok 8119 1726773036.47620: done checking to see if all hosts have failed 8119 1726773036.47622: getting the remaining hosts for this loop 8119 1726773036.47625: done getting the remaining hosts for this loop 8119 1726773036.47633: building list of next tasks for hosts 8119 1726773036.47636: getting the next task for host managed_node2 8119 1726773036.47645: done getting next task for host managed_node2 8119 1726773036.47649: ^ task is: TASK: Ensure kernel_settings_reboot_required is unset or undefined 8119 1726773036.47652: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773036.47655: done building task lists 8119 1726773036.47657: counting tasks in each state of execution 8119 1726773036.47661: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773036.47663: advancing hosts in ITERATING_TASKS 8119 1726773036.47666: starting to advance hosts 8119 1726773036.47668: getting the next task for host managed_node2 8119 1726773036.47673: done getting next task for host managed_node2 8119 1726773036.47676: ^ task is: TASK: Ensure kernel_settings_reboot_required is unset or undefined 8119 1726773036.47679: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.47681: done advancing hosts to next task 8119 1726773036.47742: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 8119 1726773036.47749: getting variables 8119 1726773036.47753: in VariableManager get_vars() 8119 1726773036.47794: Calling all_inventory to load vars for managed_node2 8119 1726773036.47801: Calling groups_inventory to load vars for managed_node2 8119 1726773036.47805: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773036.47834: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.47851: Calling all_plugins_play to load vars for managed_node2 8119 1726773036.47870: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.47887: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773036.47907: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.47919: Calling groups_plugins_play to load vars for managed_node2 8119 1726773036.47937: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.47968: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.47994: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.51126: done with get_vars() 8119 1726773036.51146: done getting variables 8119 1726773036.51154: sending task start callback, copying the task so we can template it temporarily 8119 1726773036.51157: done copying, going to template now 8119 1726773036.51160: done templating 8119 1726773036.51162: here goes the callback... 
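[annotation] The callback about to be printed announces the first assertion of the test playbook (tests_change_settings.yml:71), which checks that the role did not request a reboot. The exact expression is not in the log; a sketch consistent with the task name and with the fact having just been set to false:

    # tests/kernel_settings/tests_change_settings.yml:71 (sketch)
    - name: Ensure kernel_settings_reboot_required is unset or undefined
      assert:
        that: kernel_settings_reboot_required is not defined or not kernel_settings_reboot_required
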
TASK [Ensure kernel_settings_reboot_required is unset or undefined] ************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:71 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.112) 0:00:31.068 **** 8119 1726773036.51185: sending task start callback 8119 1726773036.51189: entering _queue_task() for managed_node2/assert 8119 1726773036.51192: Creating lock for assert 8119 1726773036.51386: worker is 1 (out of 1 available) 8119 1726773036.51424: exiting _queue_task() for managed_node2/assert 8119 1726773036.51497: done queuing things up, now waiting for results queue to drain 8119 1726773036.51502: waiting for pending results... 9305 1726773036.51736: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is unset or undefined 9305 1726773036.51793: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000014 9305 1726773036.51850: calling self._execute() 9305 1726773036.52045: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9305 1726773036.52105: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9305 1726773036.52123: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9305 1726773036.52140: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9305 1726773036.52150: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9305 1726773036.52319: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9305 1726773036.52345: starting attempt loop 9305 1726773036.52350: running the handler 9305 1726773036.54635: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9305 1726773036.54752: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9305 1726773036.54853: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9305 1726773036.54896: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9305 1726773036.54941: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9305 1726773036.54982: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9305 1726773036.55043: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9305 1726773036.55075: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9305 1726773036.55116: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9305 1726773036.55228: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9305 1726773036.55254: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9305 1726773036.55278: Loading TestModule 'mathstuff' 
from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9305 1726773036.55637: handler run complete 9305 1726773036.55645: attempt loop complete, returning result 9305 1726773036.55649: _execute() done 9305 1726773036.55651: dumping result to json 9305 1726773036.55654: done dumping result, returning 9305 1726773036.55660: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is unset or undefined [12a3200b-1e9d-1dbd-cc52-000000000014] 9305 1726773036.55671: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000014 9305 1726773036.56398: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000014 9305 1726773036.56404: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8119 1726773036.56793: no more pending results, returning what we have 8119 1726773036.56798: results queue empty 8119 1726773036.56801: checking for any_errors_fatal 8119 1726773036.56807: done checking for any_errors_fatal 8119 1726773036.56810: checking for max_fail_percentage 8119 1726773036.56813: done checking for max_fail_percentage 8119 1726773036.56815: checking to see if all hosts have failed and the running result is not ok 8119 1726773036.56817: done checking to see if all hosts have failed 8119 1726773036.56819: getting the remaining hosts for this loop 8119 1726773036.56822: done getting the remaining hosts for this loop 8119 1726773036.56831: building list of next tasks for hosts 8119 1726773036.56834: getting the next task for host managed_node2 8119 1726773036.56840: done getting next task for host managed_node2 8119 1726773036.56844: ^ task is: TASK: Ensure role reported changed 8119 1726773036.56848: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.56851: done building task lists 8119 1726773036.56853: counting tasks in each state of execution 8119 1726773036.56857: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773036.56859: advancing hosts in ITERATING_TASKS 8119 1726773036.56862: starting to advance hosts 8119 1726773036.56864: getting the next task for host managed_node2 8119 1726773036.56867: done getting next task for host managed_node2 8119 1726773036.56870: ^ task is: TASK: Ensure role reported changed 8119 1726773036.56872: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773036.56875: done advancing hosts to next task 8119 1726773036.56892: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773036.56897: getting variables 8119 1726773036.56901: in VariableManager get_vars() 8119 1726773036.56936: Calling all_inventory to load vars for managed_node2 8119 1726773036.56942: Calling groups_inventory to load vars for managed_node2 8119 1726773036.56947: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773036.56978: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.56997: Calling all_plugins_play to load vars for managed_node2 8119 1726773036.57017: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.57032: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773036.57051: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.57062: Calling groups_plugins_play to load vars for managed_node2 8119 1726773036.57079: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.57112: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.57139: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.57502: done with get_vars() 8119 1726773036.57517: done getting variables 8119 1726773036.57524: sending task start callback, copying the task so we can template it temporarily 8119 1726773036.57526: done copying, going to template now 8119 1726773036.57529: done templating 8119 1726773036.57532: here goes the callback... TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:75 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.063) 0:00:31.131 **** 8119 1726773036.57554: sending task start callback 8119 1726773036.57558: entering _queue_task() for managed_node2/assert 8119 1726773036.57715: worker is 1 (out of 1 available) 8119 1726773036.57751: exiting _queue_task() for managed_node2/assert 8119 1726773036.57862: done queuing things up, now waiting for results queue to drain 8119 1726773036.57867: waiting for pending results... 
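The assert at tests_change_settings.yml:71 above finished with "All assertions passed" and changed: false. The log does not echo the assertion expression itself; a plausible shape for such a check, reconstructed only from the task name (a hypothetical sketch, not the actual contents of the test file), is:

- name: Ensure kernel_settings_reboot_required is unset or undefined
  assert:
    that:
      - kernel_settings_reboot_required is undefined or not kernel_settings_reboot_required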
9308 1726773036.58081: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 9308 1726773036.58138: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000015 9308 1726773036.58195: calling self._execute() 9308 1726773036.58386: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9308 1726773036.58446: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9308 1726773036.58462: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9308 1726773036.58478: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9308 1726773036.58490: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9308 1726773036.58654: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9308 1726773036.58679: starting attempt loop 9308 1726773036.58685: running the handler 9308 1726773036.61008: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 9308 1726773036.61133: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 9308 1726773036.61249: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 9308 1726773036.61293: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 9308 1726773036.61335: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 9308 1726773036.61374: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 9308 1726773036.61433: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 9308 1726773036.61464: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 9308 1726773036.61492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 9308 1726773036.61619: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 9308 1726773036.61645: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 9308 1726773036.61668: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 9308 1726773036.62019: handler run complete 9308 1726773036.62026: attempt loop complete, returning result 9308 1726773036.62032: _execute() done 9308 1726773036.62035: dumping result to json 9308 1726773036.62037: done dumping result, returning 9308 1726773036.62043: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [12a3200b-1e9d-1dbd-cc52-000000000015] 9308 1726773036.62054: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000015 9308 1726773036.62159: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000015 9308 1726773036.62164: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All 
assertions passed 8119 1726773036.62717: no more pending results, returning what we have 8119 1726773036.62724: results queue empty 8119 1726773036.62726: checking for any_errors_fatal 8119 1726773036.62733: done checking for any_errors_fatal 8119 1726773036.62735: checking for max_fail_percentage 8119 1726773036.62738: done checking for max_fail_percentage 8119 1726773036.62740: checking to see if all hosts have failed and the running result is not ok 8119 1726773036.62743: done checking to see if all hosts have failed 8119 1726773036.62745: getting the remaining hosts for this loop 8119 1726773036.62748: done getting the remaining hosts for this loop 8119 1726773036.62757: building list of next tasks for hosts 8119 1726773036.62760: getting the next task for host managed_node2 8119 1726773036.62766: done getting next task for host managed_node2 8119 1726773036.62770: ^ task is: TASK: Check sysfs after role runs 8119 1726773036.62773: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773036.62775: done building task lists 8119 1726773036.62777: counting tasks in each state of execution 8119 1726773036.62782: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773036.62786: advancing hosts in ITERATING_TASKS 8119 1726773036.62789: starting to advance hosts 8119 1726773036.62791: getting the next task for host managed_node2 8119 1726773036.62795: done getting next task for host managed_node2 8119 1726773036.62797: ^ task is: TASK: Check sysfs after role runs 8119 1726773036.62800: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773036.62802: done advancing hosts to next task 8119 1726773036.62820: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773036.62824: getting variables 8119 1726773036.62827: in VariableManager get_vars() 8119 1726773036.62859: Calling all_inventory to load vars for managed_node2 8119 1726773036.62864: Calling groups_inventory to load vars for managed_node2 8119 1726773036.62867: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773036.62896: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.62913: Calling all_plugins_play to load vars for managed_node2 8119 1726773036.62929: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.62942: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773036.62959: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.62971: Calling groups_plugins_play to load vars for managed_node2 8119 1726773036.62992: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.63024: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.63044: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773036.63386: done with get_vars() 8119 1726773036.63400: done getting variables 8119 1726773036.63408: sending task start callback, copying the task so we can template it temporarily 8119 1726773036.63414: done copying, going to template now 8119 1726773036.63417: done templating 8119 1726773036.63419: here goes the callback... TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:79 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.058) 0:00:31.190 **** 8119 1726773036.63443: sending task start callback 8119 1726773036.63446: entering _queue_task() for managed_node2/command 8119 1726773036.63605: worker is 1 (out of 1 available) 8119 1726773036.63644: exiting _queue_task() for managed_node2/command 8119 1726773036.63762: done queuing things up, now waiting for results queue to drain 8119 1726773036.63767: waiting for pending results... 
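The companion assert at tests_change_settings.yml:75, "Ensure role reported changed", also returned "All assertions passed". Its expression is likewise not shown in the log; assuming the preceding role invocation was registered into a variable (the variable name below is a placeholder, not taken from the test file), it would look roughly like:

- name: Ensure role reported changed
  assert:
    that:
      - __kernel_settings_register_result is changed  # placeholder variable name, assumed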
9312 1726773036.63850: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 9312 1726773036.63906: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000016 9312 1726773036.63963: calling self._execute() 9312 1726773036.64238: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9312 1726773036.64296: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9312 1726773036.64312: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9312 1726773036.64329: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9312 1726773036.64338: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9312 1726773036.64500: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9312 1726773036.64515: starting attempt loop 9312 1726773036.64518: running the handler 9312 1726773036.64531: _low_level_execute_command(): starting 9312 1726773036.64537: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9312 1726773036.67273: stdout chunk (state=2): >>>/root <<< 9312 1726773036.67409: stderr chunk (state=3): >>><<< 9312 1726773036.67417: stdout chunk (state=3): >>><<< 9312 1726773036.67446: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9312 1726773036.67467: _low_level_execute_command(): starting 9312 1726773036.67475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221 `" && echo ansible-tmp-1726773036.6745787-9312-238350966045221="` echo /root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221 `" ) && sleep 0' 9312 1726773036.70385: stdout chunk (state=2): >>>ansible-tmp-1726773036.6745787-9312-238350966045221=/root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221 <<< 9312 1726773036.70537: stderr chunk (state=3): >>><<< 9312 1726773036.70545: stdout chunk (state=3): >>><<< 9312 1726773036.70571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773036.6745787-9312-238350966045221=/root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221 , stderr= 9312 1726773036.70735: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 9312 1726773036.70812: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221/AnsiballZ_command.py 9312 1726773036.71907: Sending initial data 9312 1726773036.71922: Sent initial data (154 bytes) 9312 1726773036.74884: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp2db4xcr0 /root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221/AnsiballZ_command.py <<< 9312 1726773036.76461: stderr chunk (state=3): >>><<< 9312 1726773036.76470: stdout chunk (state=3): >>><<< 9312 1726773036.76505: done transferring module to remote 9312 1726773036.76526: _low_level_execute_command(): starting 9312 1726773036.76534: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221/ /root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221/AnsiballZ_command.py && sleep 0' 9312 1726773036.80167: stderr chunk (state=2): >>><<< 9312 1726773036.80182: stdout chunk (state=2): >>><<< 9312 1726773036.80210: _low_level_execute_command() done: rc=0, stdout=, stderr= 9312 1726773036.80216: _low_level_execute_command(): starting 9312 1726773036.80226: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221/AnsiballZ_command.py && sleep 0' 9312 1726773037.95799: stdout chunk (state=2): >>> {"cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "stdout": "65000", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:36.951676", "end": "2024-09-19 15:10:37.955803", "delta": "0:00:01.004127", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9312 1726773037.96942: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9312 1726773037.96981: stderr chunk (state=3): >>><<< 9312 1726773037.96987: stdout chunk (state=3): >>><<< 9312 1726773037.97013: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "stdout": "65000", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:36.951676", "end": "2024-09-19 15:10:37.955803", "delta": "0:00:01.004127", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
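The "Check sysfs after role runs" task (tests_change_settings.yml:79) is fully visible in the module arguments above: it runs grep -x 65000 /sys/class/net/lo/mtu on the managed node, so it succeeds only if the role really set the loopback MTU to exactly 65000. The command module itself reports changed: true, yet the final task result below is changed: false, which suggests the task suppresses the change report; a sketch consistent with the log (the changed_when line is an assumption) is:

- name: Check sysfs after role runs
  command: grep -x 65000 /sys/class/net/lo/mtu
  changed_when: false

The surrounding chatter is the normal remote module delivery cycle shown above: create a per-task temp directory under /root/.ansible/tmp, sftp the packaged AnsiballZ_command.py across, chmod u+x it, run it with /usr/libexec/platform-python, then remove the temp directory.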
9312 1726773037.97049: done with _execute_module (command, {'_raw_params': 'grep -x 65000 /sys/class/net/lo/mtu', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9312 1726773037.97062: _low_level_execute_command(): starting 9312 1726773037.97067: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773036.6745787-9312-238350966045221/ > /dev/null 2>&1 && sleep 0' 9312 1726773037.99813: stderr chunk (state=2): >>><<< 9312 1726773037.99827: stdout chunk (state=2): >>><<< 9312 1726773037.99852: _low_level_execute_command() done: rc=0, stdout=, stderr= 9312 1726773037.99860: handler run complete 9312 1726773037.99871: attempt loop complete, returning result 9312 1726773037.99886: _execute() done 9312 1726773037.99888: dumping result to json 9312 1726773037.99892: done dumping result, returning 9312 1726773037.99903: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [12a3200b-1e9d-1dbd-cc52-000000000016] 9312 1726773037.99918: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000016 9312 1726773037.99955: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000016 9312 1726773037.99959: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "65000", "/sys/class/net/lo/mtu" ], "delta": "0:00:01.004127", "end": "2024-09-19 15:10:37.955803", "rc": 0, "start": "2024-09-19 15:10:36.951676" } STDOUT: 65000 8119 1726773038.00219: no more pending results, returning what we have 8119 1726773038.00225: results queue empty 8119 1726773038.00227: checking for any_errors_fatal 8119 1726773038.00231: done checking for any_errors_fatal 8119 1726773038.00233: checking for max_fail_percentage 8119 1726773038.00236: done checking for max_fail_percentage 8119 1726773038.00238: checking to see if all hosts have failed and the running result is not ok 8119 1726773038.00240: done checking to see if all hosts have failed 8119 1726773038.00242: getting the remaining hosts for this loop 8119 1726773038.00244: done getting the remaining hosts for this loop 8119 1726773038.00249: building list of next tasks for hosts 8119 1726773038.00251: getting the next task for host managed_node2 8119 1726773038.00255: done getting next task for host managed_node2 8119 1726773038.00257: ^ task is: TASK: Check sysctl after role runs 8119 1726773038.00259: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773038.00260: done building task lists 8119 1726773038.00261: counting tasks in each state of execution 8119 1726773038.00267: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773038.00269: advancing hosts in ITERATING_TASKS 8119 1726773038.00270: starting to advance hosts 8119 1726773038.00271: getting the next task for host managed_node2 8119 1726773038.00273: done getting next task for host managed_node2 8119 1726773038.00275: ^ task is: TASK: Check sysctl after role runs 8119 1726773038.00276: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773038.00277: done advancing hosts to next task 8119 1726773038.00291: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773038.00295: getting variables 8119 1726773038.00297: in VariableManager get_vars() 8119 1726773038.00323: Calling all_inventory to load vars for managed_node2 8119 1726773038.00327: Calling groups_inventory to load vars for managed_node2 8119 1726773038.00329: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773038.00351: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.00360: Calling all_plugins_play to load vars for managed_node2 8119 1726773038.00370: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.00379: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773038.00391: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.00399: Calling groups_plugins_play to load vars for managed_node2 8119 1726773038.00415: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.00435: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.00448: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.00647: done with get_vars() 8119 1726773038.00657: done getting variables 8119 1726773038.00661: sending task start callback, copying the task so we can template it temporarily 8119 1726773038.00662: done copying, going to template now 8119 1726773038.00664: done templating 8119 1726773038.00665: here goes the callback... 
TASK [Check sysctl after role runs] ******************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:83 Thursday 19 September 2024 15:10:38 -0400 (0:00:01.372) 0:00:32.563 **** 8119 1726773038.00679: sending task start callback 8119 1726773038.00681: entering _queue_task() for managed_node2/shell 8119 1726773038.00806: worker is 1 (out of 1 available) 8119 1726773038.00847: exiting _queue_task() for managed_node2/shell 8119 1726773038.00923: done queuing things up, now waiting for results queue to drain 8119 1726773038.00928: waiting for pending results... 9406 1726773038.00982: running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs 9406 1726773038.01030: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000017 9406 1726773038.01080: calling self._execute() 9406 1726773038.01272: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9406 1726773038.01319: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9406 1726773038.01334: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9406 1726773038.01345: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9406 1726773038.01354: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9406 1726773038.01494: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9406 1726773038.01514: starting attempt loop 9406 1726773038.01517: running the handler 9406 1726773038.01524: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9406 1726773038.01542: _low_level_execute_command(): starting 9406 1726773038.01549: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9406 1726773038.04073: stdout chunk (state=2): >>>/root <<< 9406 1726773038.04191: stderr chunk (state=3): >>><<< 9406 1726773038.04198: stdout chunk (state=3): >>><<< 9406 1726773038.04227: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9406 1726773038.04244: _low_level_execute_command(): starting 9406 1726773038.04250: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592 `" && echo ansible-tmp-1726773038.0423656-9406-182630177624592="` echo /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592 `" ) && sleep 0' 9406 1726773038.06972: stdout chunk (state=2): >>>ansible-tmp-1726773038.0423656-9406-182630177624592=/root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592 <<< 9406 1726773038.07100: stderr chunk (state=3): >>><<< 9406 1726773038.07107: stdout chunk (state=3): >>><<< 9406 1726773038.07132: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773038.0423656-9406-182630177624592=/root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592 , stderr= 9406 1726773038.07281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 9406 1726773038.07351: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592/AnsiballZ_command.py 9406 1726773038.08164: Sending initial data 9406 1726773038.08178: Sent initial data (154 bytes) 9406 1726773038.11090: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmptx0rbclf /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592/AnsiballZ_command.py <<< 9406 1726773038.12286: stderr chunk (state=3): >>><<< 9406 1726773038.12296: stdout chunk (state=3): >>><<< 9406 1726773038.12325: done transferring module to remote 9406 1726773038.12343: _low_level_execute_command(): starting 9406 1726773038.12350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592/ /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592/AnsiballZ_command.py && sleep 0' 9406 1726773038.15578: stderr chunk (state=2): >>><<< 9406 1726773038.15592: stdout chunk (state=2): >>><<< 9406 1726773038.15613: _low_level_execute_command() done: rc=0, stdout=, stderr= 9406 1726773038.15617: _low_level_execute_command(): starting 9406 1726773038.15629: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592/AnsiballZ_command.py && sleep 0' 9406 1726773038.31023: stdout chunk (state=2): >>> {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "stdout": "400000", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:38.299545", "end": "2024-09-19 15:10:38.308230", "delta": "0:00:00.008685", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9406 1726773038.32106: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9406 1726773038.32160: stderr chunk (state=3): >>><<< 9406 1726773038.32168: stdout chunk (state=3): >>><<< 9406 1726773038.32193: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "stdout": "400000", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:38.299545", "end": "2024-09-19 15:10:38.308230", "delta": "0:00:00.008685", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
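The first "Check sysctl after role runs" task (tests_change_settings.yml:83) is a shell task whose raw params appear above: set -euo pipefail followed by sysctl -n fs.file-max | grep -x 400000, so it passes only if fs.file-max reads exactly 400000, and pipefail keeps a sysctl failure from being masked by the grep. As with the sysfs check, the final result is changed: false even though the shell module reports a change, so the task presumably overrides the change status; a sketch consistent with the log (changed_when is an assumption):

- name: Check sysctl after role runs
  shell: |
    set -euo pipefail
    sysctl -n fs.file-max | grep -x 400000
  changed_when: false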
9406 1726773038.32230: done with _execute_module (command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9406 1726773038.32247: _low_level_execute_command(): starting 9406 1726773038.32255: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773038.0423656-9406-182630177624592/ > /dev/null 2>&1 && sleep 0' 9406 1726773038.34937: stderr chunk (state=2): >>><<< 9406 1726773038.34950: stdout chunk (state=2): >>><<< 9406 1726773038.34974: _low_level_execute_command() done: rc=0, stdout=, stderr= 9406 1726773038.34988: handler run complete 9406 1726773038.35000: attempt loop complete, returning result 9406 1726773038.35014: _execute() done 9406 1726773038.35016: dumping result to json 9406 1726773038.35019: done dumping result, returning 9406 1726773038.35029: done running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs [12a3200b-1e9d-1dbd-cc52-000000000017] 9406 1726773038.35044: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000017 9406 1726773038.35091: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000017 9406 1726773038.35163: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "delta": "0:00:00.008685", "end": "2024-09-19 15:10:38.308230", "rc": 0, "start": "2024-09-19 15:10:38.299545" } STDOUT: 400000 8119 1726773038.35306: no more pending results, returning what we have 8119 1726773038.35314: results queue empty 8119 1726773038.35316: checking for any_errors_fatal 8119 1726773038.35321: done checking for any_errors_fatal 8119 1726773038.35323: checking for max_fail_percentage 8119 1726773038.35327: done checking for max_fail_percentage 8119 1726773038.35329: checking to see if all hosts have failed and the running result is not ok 8119 1726773038.35330: done checking to see if all hosts have failed 8119 1726773038.35332: getting the remaining hosts for this loop 8119 1726773038.35335: done getting the remaining hosts for this loop 8119 1726773038.35345: building list of next tasks for hosts 8119 1726773038.35347: getting the next task for host managed_node2 8119 1726773038.35354: done getting next task for host managed_node2 8119 1726773038.35357: ^ task is: TASK: Check sysctl after role runs 8119 1726773038.35360: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773038.35362: done building task lists 8119 1726773038.35363: counting tasks in each state of execution 8119 1726773038.35367: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773038.35369: advancing hosts in ITERATING_TASKS 8119 1726773038.35371: starting to advance hosts 8119 1726773038.35373: getting the next task for host managed_node2 8119 1726773038.35376: done getting next task for host managed_node2 8119 1726773038.35378: ^ task is: TASK: Check sysctl after role runs 8119 1726773038.35381: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773038.35384: done advancing hosts to next task 8119 1726773038.35401: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773038.35405: getting variables 8119 1726773038.35408: in VariableManager get_vars() 8119 1726773038.35438: Calling all_inventory to load vars for managed_node2 8119 1726773038.35442: Calling groups_inventory to load vars for managed_node2 8119 1726773038.35444: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773038.35467: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.35478: Calling all_plugins_play to load vars for managed_node2 8119 1726773038.35498: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.35517: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773038.35531: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.35538: Calling groups_plugins_play to load vars for managed_node2 8119 1726773038.35548: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.35565: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.35579: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.35797: done with get_vars() 8119 1726773038.35808: done getting variables 8119 1726773038.35814: sending task start callback, copying the task so we can template it temporarily 8119 1726773038.35816: done copying, going to template now 8119 1726773038.35818: done templating 8119 1726773038.35819: here goes the callback... 
TASK [Check sysctl after role runs] ******************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:89 Thursday 19 September 2024 15:10:38 -0400 (0:00:00.351) 0:00:32.914 **** 8119 1726773038.35837: sending task start callback 8119 1726773038.35839: entering _queue_task() for managed_node2/shell 8119 1726773038.35966: worker is 1 (out of 1 available) 8119 1726773038.36007: exiting _queue_task() for managed_node2/shell 8119 1726773038.36080: done queuing things up, now waiting for results queue to drain 8119 1726773038.36088: waiting for pending results... 9425 1726773038.36141: running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs 9425 1726773038.36182: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000018 9425 1726773038.36232: calling self._execute() 9425 1726773038.36423: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9425 1726773038.36465: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9425 1726773038.36477: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9425 1726773038.36490: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9425 1726773038.36497: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9425 1726773038.36627: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9425 1726773038.36647: starting attempt loop 9425 1726773038.36651: running the handler 9425 1726773038.36658: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9425 1726773038.36672: _low_level_execute_command(): starting 9425 1726773038.36676: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9425 1726773038.39167: stdout chunk (state=2): >>>/root <<< 9425 1726773038.39295: stderr chunk (state=3): >>><<< 9425 1726773038.39304: stdout chunk (state=3): >>><<< 9425 1726773038.39331: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9425 1726773038.39349: _low_level_execute_command(): starting 9425 1726773038.39360: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047 `" && echo ansible-tmp-1726773038.3934047-9425-248020922251047="` echo /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047 `" ) && sleep 0' 9425 1726773038.42078: stdout chunk (state=2): >>>ansible-tmp-1726773038.3934047-9425-248020922251047=/root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047 <<< 9425 1726773038.42216: stderr chunk (state=3): >>><<< 9425 1726773038.42222: stdout chunk (state=3): >>><<< 9425 1726773038.42247: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773038.3934047-9425-248020922251047=/root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047 , stderr= 9425 1726773038.42387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 9425 1726773038.42452: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047/AnsiballZ_command.py 9425 1726773038.42796: Sending initial data 9425 1726773038.42812: Sent initial data (154 bytes) 9425 1726773038.45293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp9d52h3zk /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047/AnsiballZ_command.py <<< 9425 1726773038.46310: stderr chunk (state=3): >>><<< 9425 1726773038.46319: stdout chunk (state=3): >>><<< 9425 1726773038.46343: done transferring module to remote 9425 1726773038.46361: _low_level_execute_command(): starting 9425 1726773038.46366: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047/ /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047/AnsiballZ_command.py && sleep 0' 9425 1726773038.48936: stderr chunk (state=2): >>><<< 9425 1726773038.48949: stdout chunk (state=2): >>><<< 9425 1726773038.48972: _low_level_execute_command() done: rc=0, stdout=, stderr= 9425 1726773038.48976: _low_level_execute_command(): starting 9425 1726773038.48985: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047/AnsiballZ_command.py && sleep 0' 9425 1726773038.63948: stdout chunk (state=2): >>> {"cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:38.631882", "end": "2024-09-19 15:10:38.637550", "delta": "0:00:00.005668", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9425 1726773038.64986: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9425 1726773038.65031: stderr chunk (state=3): >>><<< 9425 1726773038.65036: stdout chunk (state=3): >>><<< 9425 1726773038.65056: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:10:38.631882", "end": "2024-09-19 15:10:38.637550", "delta": "0:00:00.005668", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
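The second "Check sysctl after role runs" task (tests_change_settings.yml:89) inverts the comparison: sysctl -n kernel.threads-max | grep -Lxvq 29968 (grep -v -x selects lines that are not exactly "29968", -q silences output), so the intent appears to be that the test passes only if kernel.threads-max no longer reads 29968, which matches the empty stdout and rc=0 above. A sketch consistent with the module arguments in the log (again, changed_when is an assumption):

- name: Check sysctl after role runs
  shell: |
    set -euo pipefail
    sysctl -n kernel.threads-max | grep -Lxvq 29968
  changed_when: false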
9425 1726773038.65095: done with _execute_module (command, {'_raw_params': 'set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9425 1726773038.65107: _low_level_execute_command(): starting 9425 1726773038.65118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773038.3934047-9425-248020922251047/ > /dev/null 2>&1 && sleep 0' 9425 1726773038.67741: stderr chunk (state=2): >>><<< 9425 1726773038.67752: stdout chunk (state=2): >>><<< 9425 1726773038.67771: _low_level_execute_command() done: rc=0, stdout=, stderr= 9425 1726773038.67782: handler run complete 9425 1726773038.67795: attempt loop complete, returning result 9425 1726773038.67811: _execute() done 9425 1726773038.67815: dumping result to json 9425 1726773038.67819: done dumping result, returning 9425 1726773038.67830: done running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs [12a3200b-1e9d-1dbd-cc52-000000000018] 9425 1726773038.67846: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000018 9425 1726773038.67883: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000018 9425 1726773038.67952: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "delta": "0:00:00.005668", "end": "2024-09-19 15:10:38.637550", "rc": 0, "start": "2024-09-19 15:10:38.631882" } 8119 1726773038.68147: no more pending results, returning what we have 8119 1726773038.68152: results queue empty 8119 1726773038.68154: checking for any_errors_fatal 8119 1726773038.68158: done checking for any_errors_fatal 8119 1726773038.68159: checking for max_fail_percentage 8119 1726773038.68161: done checking for max_fail_percentage 8119 1726773038.68162: checking to see if all hosts have failed and the running result is not ok 8119 1726773038.68164: done checking to see if all hosts have failed 8119 1726773038.68165: getting the remaining hosts for this loop 8119 1726773038.68167: done getting the remaining hosts for this loop 8119 1726773038.68173: building list of next tasks for hosts 8119 1726773038.68175: getting the next task for host managed_node2 8119 1726773038.68179: done getting next task for host managed_node2 8119 1726773038.68181: ^ task is: TASK: Reboot the machine - see if settings persist after reboot 8119 1726773038.68189: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773038.68193: done building task lists 8119 1726773038.68195: counting tasks in each state of execution 8119 1726773038.68200: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773038.68201: advancing hosts in ITERATING_TASKS 8119 1726773038.68203: starting to advance hosts 8119 1726773038.68205: getting the next task for host managed_node2 8119 1726773038.68208: done getting next task for host managed_node2 8119 1726773038.68212: ^ task is: TASK: Reboot the machine - see if settings persist after reboot 8119 1726773038.68214: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773038.68215: done advancing hosts to next task 8119 1726773038.68229: Loading ActionModule 'reboot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773038.68232: getting variables 8119 1726773038.68234: in VariableManager get_vars() 8119 1726773038.68264: Calling all_inventory to load vars for managed_node2 8119 1726773038.68269: Calling groups_inventory to load vars for managed_node2 8119 1726773038.68271: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773038.68320: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.68334: Calling all_plugins_play to load vars for managed_node2 8119 1726773038.68345: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.68354: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773038.68364: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.68374: Calling groups_plugins_play to load vars for managed_node2 8119 1726773038.68388: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.68408: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.68428: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773038.68638: done with get_vars() 8119 1726773038.68650: done getting variables 8119 1726773038.68655: sending task start callback, copying the task so we can template it temporarily 8119 1726773038.68657: done copying, going to template now 8119 1726773038.68659: done templating 8119 1726773038.68660: here goes the callback... 
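The strategy has now advanced to the reboot check at tests_change_settings.yml:95. The reboot action plugin runs on the controller: the lines that follow show it first invoking the setup module with gather_subset min to learn the target's distribution before it goes on to issue the reboot and wait for the host to come back. No module parameters are echoed in this part of the log, so the sketch below is an assumption based only on the task name:

- name: Reboot the machine - see if settings persist after reboot
  reboot: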
TASK [Reboot the machine - see if settings persist after reboot] *************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:95 Thursday 19 September 2024 15:10:38 -0400 (0:00:00.328) 0:00:33.243 **** 8119 1726773038.68675: sending task start callback 8119 1726773038.68677: entering _queue_task() for managed_node2/reboot 8119 1726773038.68790: worker is 1 (out of 1 available) 8119 1726773038.68830: exiting _queue_task() for managed_node2/reboot 8119 1726773038.68905: done queuing things up, now waiting for results queue to drain 8119 1726773038.68913: waiting for pending results... 9438 1726773038.68961: running TaskExecutor() for managed_node2/TASK: Reboot the machine - see if settings persist after reboot 9438 1726773038.69006: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000019 9438 1726773038.69052: calling self._execute() 9438 1726773038.69202: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 9438 1726773038.69244: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9438 1726773038.69256: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 9438 1726773038.69266: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9438 1726773038.69275: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9438 1726773038.69415: Loading ActionModule 'reboot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9438 1726773038.69434: starting attempt loop 9438 1726773038.69436: running the handler 9438 1726773038.69444: reboot: running setup module to get distribution 9438 1726773038.69454: _low_level_execute_command(): starting 9438 1726773038.69460: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9438 1726773038.71925: stdout chunk (state=2): >>>/root <<< 9438 1726773038.72044: stderr chunk (state=3): >>><<< 9438 1726773038.72052: stdout chunk (state=3): >>><<< 9438 1726773038.72076: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9438 1726773038.72093: _low_level_execute_command(): starting 9438 1726773038.72100: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160 `" && echo ansible-tmp-1726773038.720861-9438-8084398300160="` echo /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160 `" ) && sleep 0' 9438 1726773038.74760: stdout chunk (state=2): >>>ansible-tmp-1726773038.720861-9438-8084398300160=/root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160 <<< 9438 1726773038.74882: stderr chunk (state=3): >>><<< 9438 1726773038.74889: stdout chunk (state=3): >>><<< 9438 1726773038.74908: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773038.720861-9438-8084398300160=/root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160 , stderr= 9438 1726773038.75002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/setup-ZIP_DEFLATED 9438 1726773038.75098: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_setup.py 9438 1726773038.75408: Sending initial data 9438 1726773038.75427: Sent initial data (149 bytes) 9438 1726773038.77865: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp9z27ar9n /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_setup.py <<< 9438 1726773038.79681: stderr chunk (state=3): >>><<< 9438 1726773038.79690: stdout chunk (state=3): >>><<< 9438 1726773038.79721: done transferring module to remote 9438 1726773038.79736: _low_level_execute_command(): starting 9438 1726773038.79741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/ /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_setup.py && sleep 0' 9438 1726773038.82329: stderr chunk (state=2): >>><<< 9438 1726773038.82345: stdout chunk (state=2): >>><<< 9438 1726773038.82370: _low_level_execute_command() done: rc=0, stdout=, stderr= 9438 1726773038.82375: _low_level_execute_command(): starting 9438 1726773038.82382: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_setup.py && sleep 0' 9438 1726773039.08241: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "NA", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALFZBDdfqkngrfgrSSO/kOko3izToWrd6nOpiEjdLVtcXzovVx/6erjnSGabbSOWK8kW9F5xCwoGapRqymVUpYt2iP3WjMoraiZe76U/z6uZzecAsA1RHY0a1aNyRnOVuT0MwGPdOrPG0IUEHdcjNFPIyIR7a1a7jQuuK/VUomILAAAAFQCa+HW/TznpkY1Sjf2CFciRg0F35QAAAIBWyN49dx0xdzzoLQwc1selHyVtecBGccevTf1vXKodpJhjh6cK8qY+3hXglj10iG4E6TtsmyqPol1hCZfFivhH22g02zVJ+hqKtiliw1mg3bP/lOHHTfHADF8cFnZIBCbWAu6XFID+j0R5RJAaZYrnOht9+1c+fjumGg9DDWqgQQAAAIEAoc1c1DCr/HzzPvbX6dfCKdaFtfLJNHCDDyFpuCKPB2NxRjyz5zkgd9ECD5Db2fyiB7rrkVpKgl8MnJ3ERomNMEakr1OsEhnkLz6QnAKkJ27EIvUbiucfxnFY0Mdr+F0O32CpkSFIpOVhHqfa31c2jqBuLUn3A0kvZR5zIvnlviY=", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDX9Txnu9uEFeu0KZgsdvC4y52gHMP2bfEsIr97BRES1SRGlVvZsrGojsWPQfeFjIxtnUYjZv/DvIzys5NaA3SoIa95g/0tFTtlyzf98gQdtW1WMnnIsmnj6zaOoRYUhkEBR20EF1Yg32E7aTDyyGVTK/TVsoH3XeAqwbhznS1EGFxqyJhKP4NccmB0G2bwjxAMEGt/YPunlhmVPiOTFNqKeUc1BiUQQUbEfmkDZW7GTv3YDDw9KDrwHipZwfHyZgq/6A6QlKXlx1ddePP3sey+9i/3o9HUMyrEkfPLZYyiM2LbRMZ/1NOrsDuKxKTt+UeXP8HDWhxx9HNTSg9VR5lsgAH/t8QfxJYnkMpwkfOnqp9a/uVXqAccpQzKPjgbfKdmvbMeQEr4CFnAr8wNEPVdyBGYWC/tCRgAvnyMZ+QX/C/Yr1c0NVdBe24CDYz6txO0kKeJJiuGv0Lw3qlo5+r1MLdsxkN6IHzgJub9C3BG21hxi5jwQiv73IUmZJIf600=", "ansible_ssh_host_key_ecdsa_public": "AAAAE2<<< 9438 1726773039.08271: stdout chunk (state=3): 
>>>VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFX6Zs61bYvLLRIbM0riEKY9ACddaTbxcWPyntGg+v54psh7ooWEQSJH51NOypf9DqjWEdfAXxZnbti1GFYn1tA=", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAsN4dU//fXTrMOrPWiVNgqCSga1BeU/yxcnkfUXDgWW", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-8-150", "ansible_nodename": "ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "c3e1d9fa8b684f3abae9fbe37bdf8cdf", "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.11.49 54512 10.31.8.150 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.11.49 54512 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "39", "epoch": "1726773039", "date": "2024-09-19", "time": "15:10:39", "iso8601_micro": "2024-09-19T19:10:39.079141Z", "iso8601": "2024-09-19T19:10:39Z", "iso8601_basic": "20240919T151039079141", "iso8601_basic_short": "20240919T151039", "tz": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": "*", "fact_path": "/etc/ansible/facts.d"}}} <<< 9438 1726773039.09794: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 9438 1726773039.09804: stdout chunk (state=3): >>><<< 9438 1726773039.09818: stderr chunk (state=3): >>><<< 9438 1726773039.09841: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "NA", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALFZBDdfqkngrfgrSSO/kOko3izToWrd6nOpiEjdLVtcXzovVx/6erjnSGabbSOWK8kW9F5xCwoGapRqymVUpYt2iP3WjMoraiZe76U/z6uZzecAsA1RHY0a1aNyRnOVuT0MwGPdOrPG0IUEHdcjNFPIyIR7a1a7jQuuK/VUomILAAAAFQCa+HW/TznpkY1Sjf2CFciRg0F35QAAAIBWyN49dx0xdzzoLQwc1selHyVtecBGccevTf1vXKodpJhjh6cK8qY+3hXglj10iG4E6TtsmyqPol1hCZfFivhH22g02zVJ+hqKtiliw1mg3bP/lOHHTfHADF8cFnZIBCbWAu6XFID+j0R5RJAaZYrnOht9+1c+fjumGg9DDWqgQQAAAIEAoc1c1DCr/HzzPvbX6dfCKdaFtfLJNHCDDyFpuCKPB2NxRjyz5zkgd9ECD5Db2fyiB7rrkVpKgl8MnJ3ERomNMEakr1OsEhnkLz6QnAKkJ27EIvUbiucfxnFY0Mdr+F0O32CpkSFIpOVhHqfa31c2jqBuLUn3A0kvZR5zIvnlviY=", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDX9Txnu9uEFeu0KZgsdvC4y52gHMP2bfEsIr97BRES1SRGlVvZsrGojsWPQfeFjIxtnUYjZv/DvIzys5NaA3SoIa95g/0tFTtlyzf98gQdtW1WMnnIsmnj6zaOoRYUhkEBR20EF1Yg32E7aTDyyGVTK/TVsoH3XeAqwbhznS1EGFxqyJhKP4NccmB0G2bwjxAMEGt/YPunlhmVPiOTFNqKeUc1BiUQQUbEfmkDZW7GTv3YDDw9KDrwHipZwfHyZgq/6A6QlKXlx1ddePP3sey+9i/3o9HUMyrEkfPLZYyiM2LbRMZ/1NOrsDuKxKTt+UeXP8HDWhxx9HNTSg9VR5lsgAH/t8QfxJYnkMpwkfOnqp9a/uVXqAccpQzKPjgbfKdmvbMeQEr4CFnAr8wNEPVdyBGYWC/tCRgAvnyMZ+QX/C/Yr1c0NVdBe24CDYz6txO0kKeJJiuGv0Lw3qlo5+r1MLdsxkN6IHzgJub9C3BG21hxi5jwQiv73IUmZJIf600=", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFX6Zs61bYvLLRIbM0riEKY9ACddaTbxcWPyntGg+v54psh7ooWEQSJH51NOypf9DqjWEdfAXxZnbti1GFYn1tA=", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAsN4dU//fXTrMOrPWiVNgqCSga1BeU/yxcnkfUXDgWW", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-8-150", "ansible_nodename": 
"ip-10-31-8-150.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "c3e1d9fa8b684f3abae9fbe37bdf8cdf", "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.11.49 54512 10.31.8.150 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.11.49 54512 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "39", "epoch": "1726773039", "date": "2024-09-19", "time": "15:10:39", "iso8601_micro": "2024-09-19T19:10:39.079141Z", "iso8601": "2024-09-19T19:10:39Z", "iso8601_basic": "20240919T151039079141", "iso8601_basic_short": "20240919T151039", "tz": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": "*", "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.150 closed. 9438 1726773039.09992: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9438 1726773039.10004: reboot: distribution: {'name': 'centos', 'version': '8', 'family': 'redhat'} 9438 1726773039.10016: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9438 1726773039.10020: _low_level_execute_command(): starting 9438 1726773039.10028: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9438 1726773039.12854: stdout chunk (state=2): >>>e402a296-da7b-4f32-93da-fedce22c5dec <<< 9438 1726773039.12953: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 9438 1726773039.12995: stderr chunk (state=3): >>><<< 9438 1726773039.13000: stdout chunk (state=3): >>><<< 9438 1726773039.13025: _low_level_execute_command() done: rc=0, stdout=e402a296-da7b-4f32-93da-fedce22c5dec , stderr=Shared connection to 10.31.8.150 closed. 9438 1726773039.13030: reboot: last boot time: e402a296-da7b-4f32-93da-fedce22c5dec 9438 1726773039.13044: reboot: connect_timeout connection option has not been set 9438 1726773039.13053: reboot: running find module looking in ['/sbin', '/usr/sbin', '/usr/local/sbin'] to get path for "shutdown" 9438 1726773039.13169: ANSIBALLZ: Using generic lock for find 9438 1726773039.13174: ANSIBALLZ: Acquiring lock 9438 1726773039.13178: ANSIBALLZ: Lock acquired: 140408695168400 9438 1726773039.13180: ANSIBALLZ: Creating module 9438 1726773039.20367: ANSIBALLZ: Writing module into payload 9438 1726773039.20461: ANSIBALLZ: Writing module 9438 1726773039.20479: ANSIBALLZ: Renaming module 9438 1726773039.20484: ANSIBALLZ: Done creating module 9438 1726773039.20507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_find.py 9438 1726773039.20854: Sending initial data 9438 1726773039.20869: Sent initial data (148 bytes) 9438 1726773039.23425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp_b7xjz3y /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_find.py <<< 9438 1726773039.24429: stderr chunk (state=3): >>><<< 9438 1726773039.24438: stdout chunk (state=3): >>><<< 9438 1726773039.24462: done transferring module to remote 9438 1726773039.24475: _low_level_execute_command(): starting 9438 1726773039.24482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/ /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_find.py && sleep 0' 9438 1726773039.27139: stderr chunk (state=2): >>><<< 9438 1726773039.27155: stdout chunk (state=2): >>><<< 9438 1726773039.27178: _low_level_execute_command() done: rc=0, stdout=, stderr= 9438 1726773039.27185: _low_level_execute_command(): starting 9438 1726773039.27192: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/AnsiballZ_find.py && sleep 0' 9438 1726773039.47634: stdout chunk (state=2): >>> {"files": [{"path": "/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773039.416316, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}, {"path": "/usr/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773039.416316, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}], "changed": false, "msg": "", "matched": 2, "examined": 736, "invocation": {"module_args": 
{"paths": ["/sbin", "/usr/sbin", "/usr/local/sbin"], "patterns": ["shutdown"], "file_type": "any", "age_stamp": "mtime", "recurse": false, "hidden": false, "follow": false, "get_checksum": false, "use_regex": false, "excludes": null, "contains": null, "age": null, "size": null, "depth": null}}} <<< 9438 1726773039.48660: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9438 1726773039.48719: stderr chunk (state=3): >>><<< 9438 1726773039.48725: stdout chunk (state=3): >>><<< 9438 1726773039.48746: _low_level_execute_command() done: rc=0, stdout= {"files": [{"path": "/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773039.416316, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}, {"path": "/usr/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773039.416316, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}], "changed": false, "msg": "", "matched": 2, "examined": 736, "invocation": {"module_args": {"paths": ["/sbin", "/usr/sbin", "/usr/local/sbin"], "patterns": ["shutdown"], "file_type": "any", "age_stamp": "mtime", "recurse": false, "hidden": false, "follow": false, "get_checksum": false, "use_regex": false, "excludes": null, "contains": null, "age": null, "size": null, "depth": null}}} , stderr=Shared connection to 10.31.8.150 closed. 9438 1726773039.48819: done with _execute_module (find, {'paths': ['/sbin', '/usr/sbin', '/usr/local/sbin'], 'patterns': ['shutdown'], 'file_type': 'any', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'find', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9438 1726773039.48836: reboot: rebooting server with command '/sbin/shutdown -r 0 "Reboot initiated by Ansible"' 9438 1726773039.48839: _low_level_execute_command(): starting 9438 1726773039.48845: _low_level_execute_command(): executing: /bin/sh -c '/sbin/shutdown -r 0 "Reboot initiated by Ansible" && sleep 0' 9438 1726773039.52579: stdout chunk (state=2): >>>Shutdown scheduled for Thu 2024-09-19 15:10:39 EDT, use 'shutdown -c' to cancel. <<< 9438 1726773039.52804: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 9438 1726773039.52859: stderr chunk (state=3): >>><<< 9438 1726773039.52865: stdout chunk (state=3): >>><<< 9438 1726773039.52888: _low_level_execute_command() done: rc=0, stdout=Shutdown scheduled for Thu 2024-09-19 15:10:39 EDT, use 'shutdown -c' to cancel. , stderr=Shared connection to 10.31.8.150 closed. 9438 1726773039.52904: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9438 1726773039.52906: _low_level_execute_command(): starting 9438 1726773039.52913: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9438 1726773039.62263: stderr chunk (state=2): >>>Shared connection to 10.31.8.150 closed. <<< 9438 1726773039.62281: stdout chunk (state=2): >>><<< 9438 1726773039.62299: stderr chunk (state=3): >>><<< 9438 1726773039.62380: reboot: last boot time check fail 'Failed to connect to the host via ssh: Shared connection to 10.31.8.150 closed.', retrying in 1.189 seconds... 9438 1726773040.81393: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9438 1726773040.81400: _low_level_execute_command(): starting 9438 1726773040.81412: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9438 1726773050.83467: stderr chunk (state=2): >>>ssh: connect to host 10.31.8.150 port 22: Connection timed out <<< 9438 1726773050.83486: stdout chunk (state=2): >>><<< 9438 1726773050.83507: stderr chunk (state=3): >>><<< 9438 1726773050.83609: reboot: last boot time check fail 'Failed to connect to the host via ssh: ssh: connect to host 10.31.8.150 port 22: Connection timed out', retrying in 2.801 seconds... 9438 1726773053.63830: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9438 1726773053.63841: _low_level_execute_command(): starting 9438 1726773053.63850: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9438 1726773061.07061: stdout chunk (state=2): >>>d3f7bf67-0772-4448-83e2-631a7f91c2e4 <<< 9438 1726773061.07375: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9438 1726773061.07442: stderr chunk (state=3): >>><<< 9438 1726773061.07449: stdout chunk (state=3): >>><<< 9438 1726773061.07471: _low_level_execute_command() done: rc=0, stdout=d3f7bf67-0772-4448-83e2-631a7f91c2e4 , stderr=Shared connection to 10.31.8.150 closed. 9438 1726773061.07476: reboot: last boot time: d3f7bf67-0772-4448-83e2-631a7f91c2e4 9438 1726773061.07479: reboot: last boot time check success 9438 1726773061.07498: reboot: attempting post-reboot test command 'tuned-adm active' 9438 1726773061.07500: _low_level_execute_command(): starting 9438 1726773061.07506: _low_level_execute_command(): executing: /bin/sh -c 'tuned-adm active && sleep 0' 9438 1726773061.21562: stdout chunk (state=2): >>>Current active profile: virtual-guest kernel_settings <<< 9438 1726773061.23079: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 9438 1726773061.23129: stderr chunk (state=3): >>><<< 9438 1726773061.23135: stdout chunk (state=3): >>><<< 9438 1726773061.23158: _low_level_execute_command() done: rc=0, stdout=Current active profile: virtual-guest kernel_settings , stderr=Shared connection to 10.31.8.150 closed. 
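The retry loop above is how the reboot action decides the machine is really back: it compares /proc/sys/kernel/random/boot_id (which changes on every boot) before and after the reboot, tolerating the SSH failures while the host is down, and only then runs the post-reboot test command. A minimal sketch of that idea, assuming a caller-supplied run() helper for remote execution (hypothetical, not the plugin's real API):

    import time

    BOOT_ID_PATH = "/proc/sys/kernel/random/boot_id"

    def wait_for_new_boot_id(run, old_boot_id, timeout=600):
        """Poll the remote boot_id until it differs from the pre-reboot value.

        `run` is any callable that executes a shell command on the target and
        returns its stdout, raising on connection failure while the host is down.
        """
        deadline = time.monotonic() + timeout
        delay = 1.0
        while time.monotonic() < deadline:
            try:
                new_id = run("cat " + BOOT_ID_PATH).strip()
                if new_id and new_id != old_boot_id:
                    return new_id  # fresh boot_id: the host rebooted and is reachable again
            except Exception:
                pass  # connection refused/timed out while rebooting; keep retrying
            time.sleep(delay)
            delay = min(delay * 2, 30)  # simple backoff between attempts
        raise TimeoutError("host did not come back with a new boot_id")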
9438 1726773061.23166: reboot: post-reboot test command success 9438 1726773061.23181: _low_level_execute_command(): starting 9438 1726773061.23188: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773038.720861-9438-8084398300160/ > /dev/null 2>&1 && sleep 0' 9438 1726773061.26160: stderr chunk (state=2): >>><<< 9438 1726773061.26171: stdout chunk (state=2): >>><<< 9438 1726773061.26195: _low_level_execute_command() done: rc=0, stdout=, stderr= 9438 1726773061.26202: handler run complete 9438 1726773061.26209: attempt loop complete, returning result 9438 1726773061.26222: _execute() done 9438 1726773061.26224: dumping result to json 9438 1726773061.26226: done dumping result, returning 9438 1726773061.26237: done running TaskExecutor() for managed_node2/TASK: Reboot the machine - see if settings persist after reboot [12a3200b-1e9d-1dbd-cc52-000000000019] 9438 1726773061.26250: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000019 9438 1726773061.26287: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000019 9438 1726773061.26328: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "elapsed": 21, "rebooted": true } 8119 1726773061.26507: no more pending results, returning what we have 8119 1726773061.26514: results queue empty 8119 1726773061.26517: checking for any_errors_fatal 8119 1726773061.26522: done checking for any_errors_fatal 8119 1726773061.26525: checking for max_fail_percentage 8119 1726773061.26528: done checking for max_fail_percentage 8119 1726773061.26531: checking to see if all hosts have failed and the running result is not ok 8119 1726773061.26533: done checking to see if all hosts have failed 8119 1726773061.26535: getting the remaining hosts for this loop 8119 1726773061.26537: done getting the remaining hosts for this loop 8119 1726773061.26545: building list of next tasks for hosts 8119 1726773061.26547: getting the next task for host managed_node2 8119 1726773061.26553: done getting next task for host managed_node2 8119 1726773061.26556: ^ task is: TASK: Check sysctl after reboot 8119 1726773061.26559: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773061.26561: done building task lists 8119 1726773061.26563: counting tasks in each state of execution 8119 1726773061.26566: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773061.26569: advancing hosts in ITERATING_TASKS 8119 1726773061.26571: starting to advance hosts 8119 1726773061.26572: getting the next task for host managed_node2 8119 1726773061.26575: done getting next task for host managed_node2 8119 1726773061.26576: ^ task is: TASK: Check sysctl after reboot 8119 1726773061.26578: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773061.26579: done advancing hosts to next task 8119 1726773061.26595: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773061.26599: getting variables 8119 1726773061.26601: in VariableManager get_vars() 8119 1726773061.26629: Calling all_inventory to load vars for managed_node2 8119 1726773061.26633: Calling groups_inventory to load vars for managed_node2 8119 1726773061.26636: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773061.26658: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.26669: Calling all_plugins_play to load vars for managed_node2 8119 1726773061.26679: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.26690: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773061.26705: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.26716: Calling groups_plugins_play to load vars for managed_node2 8119 1726773061.26728: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.26746: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.26759: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.26963: done with get_vars() 8119 1726773061.26974: done getting variables 8119 1726773061.26977: sending task start callback, copying the task so we can template it temporarily 8119 1726773061.26979: done copying, going to template now 8119 1726773061.26981: done templating 8119 1726773061.26985: here goes the callback... TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:99 Thursday 19 September 2024 15:11:01 -0400 (0:00:22.583) 0:00:55.826 **** 8119 1726773061.27000: sending task start callback 8119 1726773061.27002: entering _queue_task() for managed_node2/shell 8119 1726773061.27122: worker is 1 (out of 1 available) 8119 1726773061.27160: exiting _queue_task() for managed_node2/shell 8119 1726773061.27235: done queuing things up, now waiting for results queue to drain 8119 1726773061.27241: waiting for pending results... 
10394 1726773061.27295: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10394 1726773061.27345: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000001a 10394 1726773061.27392: calling self._execute() 10394 1726773061.27540: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10394 1726773061.27580: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10394 1726773061.27593: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10394 1726773061.27608: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10394 1726773061.27615: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10394 1726773061.27740: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10394 1726773061.27761: starting attempt loop 10394 1726773061.27764: running the handler 10394 1726773061.27772: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10394 1726773061.27789: _low_level_execute_command(): starting 10394 1726773061.27795: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10394 1726773061.30238: stdout chunk (state=2): >>>/root <<< 10394 1726773061.30354: stderr chunk (state=3): >>><<< 10394 1726773061.30359: stdout chunk (state=3): >>><<< 10394 1726773061.30377: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10394 1726773061.30392: _low_level_execute_command(): starting 10394 1726773061.30399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532 `" && echo ansible-tmp-1726773061.303864-10394-105280600851532="` echo /root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532 `" ) && sleep 0' 10394 1726773061.33159: stdout chunk (state=2): >>>ansible-tmp-1726773061.303864-10394-105280600851532=/root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532 <<< 10394 1726773061.33288: stderr chunk (state=3): >>><<< 10394 1726773061.33294: stdout chunk (state=3): >>><<< 10394 1726773061.33316: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773061.303864-10394-105280600851532=/root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532 , stderr= 10394 1726773061.33451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 10394 1726773061.33522: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532/AnsiballZ_command.py 10394 1726773061.33866: Sending initial data 10394 1726773061.33882: Sent initial data (154 bytes) 10394 1726773061.36747: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpcakgco5g 
/root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532/AnsiballZ_command.py <<< 10394 1726773061.37719: stderr chunk (state=3): >>><<< 10394 1726773061.37727: stdout chunk (state=3): >>><<< 10394 1726773061.37754: done transferring module to remote 10394 1726773061.37775: _low_level_execute_command(): starting 10394 1726773061.37785: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532/ /root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532/AnsiballZ_command.py && sleep 0' 10394 1726773061.40412: stderr chunk (state=2): >>><<< 10394 1726773061.40428: stdout chunk (state=2): >>><<< 10394 1726773061.40450: _low_level_execute_command() done: rc=0, stdout=, stderr= 10394 1726773061.40453: _low_level_execute_command(): starting 10394 1726773061.40460: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532/AnsiballZ_command.py && sleep 0' 10394 1726773061.57741: stdout chunk (state=2): >>> {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "stdout": "400000", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:00.699820", "end": "2024-09-19 15:11:00.710185", "delta": "0:00:00.010365", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10394 1726773061.58823: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10394 1726773061.58869: stderr chunk (state=3): >>><<< 10394 1726773061.58874: stdout chunk (state=3): >>><<< 10394 1726773061.58902: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "stdout": "400000", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:00.699820", "end": "2024-09-19 15:11:00.710185", "delta": "0:00:00.010365", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
10394 1726773061.58943: done with _execute_module (command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10394 1726773061.58957: _low_level_execute_command(): starting 10394 1726773061.58962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773061.303864-10394-105280600851532/ > /dev/null 2>&1 && sleep 0' 10394 1726773061.61574: stderr chunk (state=2): >>><<< 10394 1726773061.61589: stdout chunk (state=2): >>><<< 10394 1726773061.61608: _low_level_execute_command() done: rc=0, stdout=, stderr= 10394 1726773061.61618: handler run complete 10394 1726773061.61630: attempt loop complete, returning result 10394 1726773061.61642: _execute() done 10394 1726773061.61646: dumping result to json 10394 1726773061.61652: done dumping result, returning 10394 1726773061.61663: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [12a3200b-1e9d-1dbd-cc52-00000000001a] 10394 1726773061.61676: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001a 10394 1726773061.61715: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001a 10394 1726773061.61786: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "delta": "0:00:00.010365", "end": "2024-09-19 15:11:00.710185", "rc": 0, "start": "2024-09-19 15:11:00.699820" } STDOUT: 400000 8119 1726773061.61907: no more pending results, returning what we have 8119 1726773061.61915: results queue empty 8119 1726773061.61918: checking for any_errors_fatal 8119 1726773061.61923: done checking for any_errors_fatal 8119 1726773061.61924: checking for max_fail_percentage 8119 1726773061.61928: done checking for max_fail_percentage 8119 1726773061.61929: checking to see if all hosts have failed and the running result is not ok 8119 1726773061.61931: done checking to see if all hosts have failed 8119 1726773061.61933: getting the remaining hosts for this loop 8119 1726773061.61936: done getting the remaining hosts for this loop 8119 1726773061.61943: building list of next tasks for hosts 8119 1726773061.61946: getting the next task for host managed_node2 8119 1726773061.61953: done getting next task for host managed_node2 8119 1726773061.61956: ^ task is: TASK: Check sysfs after reboot 8119 1726773061.61959: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
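The check that just passed pipes `sysctl -n fs.file-max` through `grep -x 400000`, asserting that the tuned value survived the reboot. A local equivalent in Python, assuming the expected value 400000 from this test run (reading /proc/sys is what `sysctl -n` does under the hood):

    def read_sysctl(name):
        """Read a sysctl value the same way `sysctl -n` does."""
        path = "/proc/sys/" + name.replace(".", "/")
        with open(path) as f:
            return f.read().strip()

    assert read_sysctl("fs.file-max") == "400000", "fs.file-max did not persist across the reboot"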
False 8119 1726773061.61961: done building task lists 8119 1726773061.61963: counting tasks in each state of execution 8119 1726773061.61967: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773061.61969: advancing hosts in ITERATING_TASKS 8119 1726773061.61971: starting to advance hosts 8119 1726773061.61973: getting the next task for host managed_node2 8119 1726773061.61977: done getting next task for host managed_node2 8119 1726773061.61979: ^ task is: TASK: Check sysfs after reboot 8119 1726773061.61981: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773061.61986: done advancing hosts to next task 8119 1726773061.62002: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773061.62006: getting variables 8119 1726773061.62012: in VariableManager get_vars() 8119 1726773061.62043: Calling all_inventory to load vars for managed_node2 8119 1726773061.62047: Calling groups_inventory to load vars for managed_node2 8119 1726773061.62049: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773061.62073: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.62086: Calling all_plugins_play to load vars for managed_node2 8119 1726773061.62099: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.62108: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773061.62121: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.62131: Calling groups_plugins_play to load vars for managed_node2 8119 1726773061.62143: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.62161: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.62175: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.62415: done with get_vars() 8119 1726773061.62425: done getting variables 8119 1726773061.62429: sending task start callback, copying the task so we can template it temporarily 8119 1726773061.62431: done copying, going to template now 8119 1726773061.62433: done templating 8119 1726773061.62434: here goes the callback... 
TASK [Check sysfs after reboot] ************************************************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:105 Thursday 19 September 2024 15:11:01 -0400 (0:00:00.354) 0:00:56.180 **** 8119 1726773061.62449: sending task start callback 8119 1726773061.62451: entering _queue_task() for managed_node2/command 8119 1726773061.62570: worker is 1 (out of 1 available) 8119 1726773061.62613: exiting _queue_task() for managed_node2/command 8119 1726773061.62688: done queuing things up, now waiting for results queue to drain 8119 1726773061.62694: waiting for pending results... 10406 1726773061.62746: running TaskExecutor() for managed_node2/TASK: Check sysfs after reboot 10406 1726773061.62791: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000001b 10406 1726773061.62839: calling self._execute() 10406 1726773061.62987: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10406 1726773061.63031: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10406 1726773061.63043: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10406 1726773061.63058: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10406 1726773061.63065: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10406 1726773061.63196: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10406 1726773061.63217: starting attempt loop 10406 1726773061.63220: running the handler 10406 1726773061.63230: _low_level_execute_command(): starting 10406 1726773061.63234: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10406 1726773061.65666: stdout chunk (state=2): >>>/root <<< 10406 1726773061.65780: stderr chunk (state=3): >>><<< 10406 1726773061.65788: stdout chunk (state=3): >>><<< 10406 1726773061.65812: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10406 1726773061.65831: _low_level_execute_command(): starting 10406 1726773061.65838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986 `" && echo ansible-tmp-1726773061.6582134-10406-105773171194986="` echo /root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986 `" ) && sleep 0' 10406 1726773061.68672: stdout chunk (state=2): >>>ansible-tmp-1726773061.6582134-10406-105773171194986=/root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986 <<< 10406 1726773061.68797: stderr chunk (state=3): >>><<< 10406 1726773061.68803: stdout chunk (state=3): >>><<< 10406 1726773061.68826: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773061.6582134-10406-105773171194986=/root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986 , stderr= 10406 1726773061.68954: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 10406 1726773061.69016: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986/AnsiballZ_command.py 10406 1726773061.69328: Sending initial data 10406 1726773061.69343: Sent initial data (155 bytes) 10406 1726773061.71836: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp9q67us_w /root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986/AnsiballZ_command.py <<< 10406 1726773061.72853: stderr chunk (state=3): >>><<< 10406 1726773061.72861: stdout chunk (state=3): >>><<< 10406 1726773061.72886: done transferring module to remote 10406 1726773061.72902: _low_level_execute_command(): starting 10406 1726773061.72908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986/ /root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986/AnsiballZ_command.py && sleep 0' 10406 1726773061.75466: stderr chunk (state=2): >>><<< 10406 1726773061.75482: stdout chunk (state=2): >>><<< 10406 1726773061.75505: _low_level_execute_command() done: rc=0, stdout=, stderr= 10406 1726773061.75509: _low_level_execute_command(): starting 10406 1726773061.75519: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986/AnsiballZ_command.py && sleep 0' 10406 1726773061.90472: stdout chunk (state=2): >>> {"cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "stdout": "65000", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:01.034406", "end": "2024-09-19 15:11:01.037478", "delta": "0:00:00.003072", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10406 1726773061.91510: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10406 1726773061.91561: stderr chunk (state=3): >>><<< 10406 1726773061.91570: stdout chunk (state=3): >>><<< 10406 1726773061.91593: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "stdout": "65000", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:01.034406", "end": "2024-09-19 15:11:01.037478", "delta": "0:00:00.003072", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
10406 1726773061.91629: done with _execute_module (command, {'_raw_params': 'grep -x 65000 /sys/class/net/lo/mtu', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10406 1726773061.91645: _low_level_execute_command(): starting 10406 1726773061.91651: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773061.6582134-10406-105773171194986/ > /dev/null 2>&1 && sleep 0' 10406 1726773061.94320: stderr chunk (state=2): >>><<< 10406 1726773061.94333: stdout chunk (state=2): >>><<< 10406 1726773061.94353: _low_level_execute_command() done: rc=0, stdout=, stderr= 10406 1726773061.94360: handler run complete 10406 1726773061.94370: attempt loop complete, returning result 10406 1726773061.94385: _execute() done 10406 1726773061.94388: dumping result to json 10406 1726773061.94393: done dumping result, returning 10406 1726773061.94403: done running TaskExecutor() for managed_node2/TASK: Check sysfs after reboot [12a3200b-1e9d-1dbd-cc52-00000000001b] 10406 1726773061.94418: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001b 10406 1726773061.94457: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001b 10406 1726773061.94505: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "65000", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.003072", "end": "2024-09-19 15:11:01.037478", "rc": 0, "start": "2024-09-19 15:11:01.034406" } STDOUT: 65000 8119 1726773061.94713: no more pending results, returning what we have 8119 1726773061.94720: results queue empty 8119 1726773061.94722: checking for any_errors_fatal 8119 1726773061.94727: done checking for any_errors_fatal 8119 1726773061.94729: checking for max_fail_percentage 8119 1726773061.94732: done checking for max_fail_percentage 8119 1726773061.94734: checking to see if all hosts have failed and the running result is not ok 8119 1726773061.94736: done checking to see if all hosts have failed 8119 1726773061.94738: getting the remaining hosts for this loop 8119 1726773061.94741: done getting the remaining hosts for this loop 8119 1726773061.94747: building list of next tasks for hosts 8119 1726773061.94749: getting the next task for host managed_node2 8119 1726773061.94754: done getting next task for host managed_node2 8119 1726773061.94756: ^ task is: TASK: Check sysctl after reboot 8119 1726773061.94758: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
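This task verifies the sysfs side of the role: `grep -x 65000 /sys/class/net/lo/mtu` succeeds only if the loopback MTU still holds exactly the configured value. The same check expressed directly in Python (the value 65000 is taken from this run):

    # Equivalent of `grep -x 65000 /sys/class/net/lo/mtu`
    with open("/sys/class/net/lo/mtu") as f:
        mtu = f.read().strip()
    assert mtu == "65000", f"lo MTU is {mtu}, expected 65000 after reboot"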
False 8119 1726773061.94760: done building task lists 8119 1726773061.94761: counting tasks in each state of execution 8119 1726773061.94764: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773061.94765: advancing hosts in ITERATING_TASKS 8119 1726773061.94767: starting to advance hosts 8119 1726773061.94768: getting the next task for host managed_node2 8119 1726773061.94770: done getting next task for host managed_node2 8119 1726773061.94771: ^ task is: TASK: Check sysctl after reboot 8119 1726773061.94773: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773061.94774: done advancing hosts to next task 8119 1726773061.94791: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773061.94796: getting variables 8119 1726773061.94799: in VariableManager get_vars() 8119 1726773061.94828: Calling all_inventory to load vars for managed_node2 8119 1726773061.94833: Calling groups_inventory to load vars for managed_node2 8119 1726773061.94835: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773061.94859: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.94870: Calling all_plugins_play to load vars for managed_node2 8119 1726773061.94880: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.94892: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773061.94908: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.94917: Calling groups_plugins_play to load vars for managed_node2 8119 1726773061.94927: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.94946: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.94960: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773061.95165: done with get_vars() 8119 1726773061.95175: done getting variables 8119 1726773061.95179: sending task start callback, copying the task so we can template it temporarily 8119 1726773061.95181: done copying, going to template now 8119 1726773061.95185: done templating 8119 1726773061.95187: here goes the callback... 
TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:109 Thursday 19 September 2024 15:11:01 -0400 (0:00:00.327) 0:00:56.508 **** 8119 1726773061.95202: sending task start callback 8119 1726773061.95204: entering _queue_task() for managed_node2/shell 8119 1726773061.95325: worker is 1 (out of 1 available) 8119 1726773061.95364: exiting _queue_task() for managed_node2/shell 8119 1726773061.95437: done queuing things up, now waiting for results queue to drain 8119 1726773061.95443: waiting for pending results... 10415 1726773061.95502: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10415 1726773061.95548: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000001c 10415 1726773061.95595: calling self._execute() 10415 1726773061.95794: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10415 1726773061.95839: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10415 1726773061.95851: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10415 1726773061.95861: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10415 1726773061.95867: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10415 1726773061.95998: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10415 1726773061.96017: starting attempt loop 10415 1726773061.96020: running the handler 10415 1726773061.96026: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10415 1726773061.96041: _low_level_execute_command(): starting 10415 1726773061.96047: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10415 1726773061.98579: stdout chunk (state=2): >>>/root <<< 10415 1726773061.98699: stderr chunk (state=3): >>><<< 10415 1726773061.98704: stdout chunk (state=3): >>><<< 10415 1726773061.98726: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10415 1726773061.98740: _low_level_execute_command(): starting 10415 1726773061.98746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745 `" && echo ansible-tmp-1726773061.9873376-10415-220544097402745="` echo /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745 `" ) && sleep 0' 10415 1726773062.01489: stdout chunk (state=2): >>>ansible-tmp-1726773061.9873376-10415-220544097402745=/root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745 <<< 10415 1726773062.01619: stderr chunk (state=3): >>><<< 10415 1726773062.01624: stdout chunk (state=3): >>><<< 10415 1726773062.01642: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773061.9873376-10415-220544097402745=/root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745 , stderr= 10415 1726773062.01764: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 10415 1726773062.01829: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745/AnsiballZ_command.py 10415 1726773062.02114: Sending initial data 10415 1726773062.02129: Sent initial data (155 bytes) 10415 1726773062.04619: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp0qb7k9a5 /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745/AnsiballZ_command.py <<< 10415 1726773062.05661: stderr chunk (state=3): >>><<< 10415 1726773062.05668: stdout chunk (state=3): >>><<< 10415 1726773062.05693: done transferring module to remote 10415 1726773062.05708: _low_level_execute_command(): starting 10415 1726773062.05713: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745/ /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745/AnsiballZ_command.py && sleep 0' 10415 1726773062.08307: stderr chunk (state=2): >>><<< 10415 1726773062.08321: stdout chunk (state=2): >>><<< 10415 1726773062.08340: _low_level_execute_command() done: rc=0, stdout=, stderr= 10415 1726773062.08344: _low_level_execute_command(): starting 10415 1726773062.08351: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745/AnsiballZ_command.py && sleep 0' 10415 1726773062.23622: stdout chunk (state=2): >>> {"cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:01.363374", "end": "2024-09-19 15:11:01.368999", "delta": "0:00:00.005625", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10415 1726773062.24702: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10415 1726773062.24755: stderr chunk (state=3): >>><<< 10415 1726773062.24761: stdout chunk (state=3): >>><<< 10415 1726773062.24787: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:01.363374", "end": "2024-09-19 15:11:01.368999", "delta": "0:00:00.005625", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
10415 1726773062.24824: done with _execute_module (command, {'_raw_params': 'set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10415 1726773062.24835: _low_level_execute_command(): starting 10415 1726773062.24841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773061.9873376-10415-220544097402745/ > /dev/null 2>&1 && sleep 0' 10415 1726773062.27509: stderr chunk (state=2): >>><<< 10415 1726773062.27526: stdout chunk (state=2): >>><<< 10415 1726773062.27549: _low_level_execute_command() done: rc=0, stdout=, stderr= 10415 1726773062.27559: handler run complete 10415 1726773062.27571: attempt loop complete, returning result 10415 1726773062.27585: _execute() done 10415 1726773062.27588: dumping result to json 10415 1726773062.27592: done dumping result, returning 10415 1726773062.27603: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [12a3200b-1e9d-1dbd-cc52-00000000001c] 10415 1726773062.27616: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001c 10415 1726773062.27657: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001c 10415 1726773062.27661: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "delta": "0:00:00.005625", "end": "2024-09-19 15:11:01.368999", "rc": 0, "start": "2024-09-19 15:11:01.363374" } 8119 1726773062.27902: no more pending results, returning what we have 8119 1726773062.27908: results queue empty 8119 1726773062.27910: checking for any_errors_fatal 8119 1726773062.27915: done checking for any_errors_fatal 8119 1726773062.27918: checking for max_fail_percentage 8119 1726773062.27921: done checking for max_fail_percentage 8119 1726773062.27923: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.27925: done checking to see if all hosts have failed 8119 1726773062.27927: getting the remaining hosts for this loop 8119 1726773062.27929: done getting the remaining hosts for this loop 8119 1726773062.27937: building list of next tasks for hosts 8119 1726773062.27939: getting the next task for host managed_node2 8119 1726773062.27944: done getting next task for host managed_node2 8119 1726773062.27946: ^ task is: TASK: Check with tuned verify 8119 1726773062.27949: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.27950: done building task lists 8119 1726773062.27951: counting tasks in each state of execution 8119 1726773062.27954: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.27956: advancing hosts in ITERATING_TASKS 8119 1726773062.27957: starting to advance hosts 8119 1726773062.27959: getting the next task for host managed_node2 8119 1726773062.27961: done getting next task for host managed_node2 8119 1726773062.27962: ^ task is: TASK: Check with tuned verify 8119 1726773062.27964: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.27965: done advancing hosts to next task 8119 1726773062.27978: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773062.27980: getting variables 8119 1726773062.27987: in VariableManager get_vars() 8119 1726773062.28019: Calling all_inventory to load vars for managed_node2 8119 1726773062.28024: Calling groups_inventory to load vars for managed_node2 8119 1726773062.28027: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.28050: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.28062: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.28072: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.28081: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.28094: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.28103: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.28117: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.28138: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.28151: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.28355: done with get_vars() 8119 1726773062.28366: done getting variables 8119 1726773062.28370: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.28371: done copying, going to template now 8119 1726773062.28373: done templating 8119 1726773062.28375: here goes the callback... 
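For reference, the "Check sysctl after reboot" task above passed because its shell pipeline exited 0. A minimal sketch of a task that would produce the logged module_args follows; it is reconstructed from the log rather than copied from tests_change_settings.yml:109, and the changed_when is an assumption inferred from the "changed": false shown in the callback. The grep -Lxvq idiom is intended to make the pipeline exit non-zero whenever the sysctl value is anything other than the expected 29968, which set -euo pipefail then turns into a task failure.

    # Sketch reconstructed from the logged module_args; see the assumptions noted above.
    - name: Check sysctl after reboot
      shell: |-
        set -euo pipefail
        sysctl -n kernel.threads-max | grep -Lxvq 29968
      changed_when: false
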
TASK [Check with tuned verify] ************************************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:115 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.331) 0:00:56.840 **** 8119 1726773062.28393: sending task start callback 8119 1726773062.28395: entering _queue_task() for managed_node2/command 8119 1726773062.28518: worker is 1 (out of 1 available) 8119 1726773062.28557: exiting _queue_task() for managed_node2/command 8119 1726773062.28630: done queuing things up, now waiting for results queue to drain 8119 1726773062.28635: waiting for pending results... 10424 1726773062.28698: running TaskExecutor() for managed_node2/TASK: Check with tuned verify 10424 1726773062.28747: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000001d 10424 1726773062.28798: calling self._execute() 10424 1726773062.28956: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10424 1726773062.28999: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10424 1726773062.29016: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10424 1726773062.29027: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10424 1726773062.29035: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10424 1726773062.29161: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10424 1726773062.29185: starting attempt loop 10424 1726773062.29189: running the handler 10424 1726773062.29200: _low_level_execute_command(): starting 10424 1726773062.29204: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10424 1726773062.31686: stdout chunk (state=2): >>>/root <<< 10424 1726773062.31802: stderr chunk (state=3): >>><<< 10424 1726773062.31808: stdout chunk (state=3): >>><<< 10424 1726773062.31832: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10424 1726773062.31850: _low_level_execute_command(): starting 10424 1726773062.31859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711 `" && echo ansible-tmp-1726773062.31843-10424-274522427627711="` echo /root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711 `" ) && sleep 0' 10424 1726773062.34624: stdout chunk (state=2): >>>ansible-tmp-1726773062.31843-10424-274522427627711=/root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711 <<< 10424 1726773062.34746: stderr chunk (state=3): >>><<< 10424 1726773062.34754: stdout chunk (state=3): >>><<< 10424 1726773062.34778: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773062.31843-10424-274522427627711=/root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711 , stderr= 10424 1726773062.34913: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 10424 1726773062.34973: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711/AnsiballZ_command.py 10424 1726773062.35563: Sending initial data 10424 1726773062.35579: Sent initial data (153 bytes) 10424 1726773062.37815: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp8g4tlpk3 /root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711/AnsiballZ_command.py <<< 10424 1726773062.38828: stderr chunk (state=3): >>><<< 10424 1726773062.38836: stdout chunk (state=3): >>><<< 10424 1726773062.38860: done transferring module to remote 10424 1726773062.38876: _low_level_execute_command(): starting 10424 1726773062.38885: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711/ /root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711/AnsiballZ_command.py && sleep 0' 10424 1726773062.41469: stderr chunk (state=2): >>><<< 10424 1726773062.41485: stdout chunk (state=2): >>><<< 10424 1726773062.41511: _low_level_execute_command() done: rc=0, stdout=, stderr= 10424 1726773062.41516: _low_level_execute_command(): starting 10424 1726773062.41523: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711/AnsiballZ_command.py && sleep 0' 10424 1726773062.66903: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:01.695244", "end": "2024-09-19 15:11:01.801560", "delta": "0:00:00.106316", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10424 1726773062.68062: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10424 1726773062.68112: stderr chunk (state=3): >>><<< 10424 1726773062.68118: stdout chunk (state=3): >>><<< 10424 1726773062.68140: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:01.695244", "end": "2024-09-19 15:11:01.801560", "delta": "0:00:00.106316", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
10424 1726773062.68178: done with _execute_module (command, {'_raw_params': 'tuned-adm verify -i', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10424 1726773062.68193: _low_level_execute_command(): starting 10424 1726773062.68200: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773062.31843-10424-274522427627711/ > /dev/null 2>&1 && sleep 0' 10424 1726773062.70904: stderr chunk (state=2): >>><<< 10424 1726773062.70922: stdout chunk (state=2): >>><<< 10424 1726773062.70945: _low_level_execute_command() done: rc=0, stdout=, stderr= 10424 1726773062.70952: handler run complete 10424 1726773062.70962: attempt loop complete, returning result 10424 1726773062.70974: _execute() done 10424 1726773062.70978: dumping result to json 10424 1726773062.70985: done dumping result, returning 10424 1726773062.70998: done running TaskExecutor() for managed_node2/TASK: Check with tuned verify [12a3200b-1e9d-1dbd-cc52-00000000001d] 10424 1726773062.71013: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001d 10424 1726773062.71055: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001d 10424 1726773062.71104: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.106316", "end": "2024-09-19 15:11:01.801560", "rc": 0, "start": "2024-09-19 15:11:01.695244" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8119 1726773062.71323: no more pending results, returning what we have 8119 1726773062.71327: results queue empty 8119 1726773062.71328: checking for any_errors_fatal 8119 1726773062.71332: done checking for any_errors_fatal 8119 1726773062.71334: checking for max_fail_percentage 8119 1726773062.71336: done checking for max_fail_percentage 8119 1726773062.71337: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.71339: done checking to see if all hosts have failed 8119 1726773062.71340: getting the remaining hosts for this loop 8119 1726773062.71342: done getting the remaining hosts for this loop 8119 1726773062.71347: building list of next tasks for hosts 8119 1726773062.71349: getting the next task for host managed_node2 8119 1726773062.71355: done getting next task for host managed_node2 8119 1726773062.71357: ^ task is: TASK: Apply role again and remove settings 8119 1726773062.71359: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.71361: done building task lists 8119 1726773062.71362: counting tasks in each state of execution 8119 1726773062.71365: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.71366: advancing hosts in ITERATING_TASKS 8119 1726773062.71368: starting to advance hosts 8119 1726773062.71369: getting the next task for host managed_node2 8119 1726773062.71371: done getting next task for host managed_node2 8119 1726773062.71372: ^ task is: TASK: Apply role again and remove settings 8119 1726773062.71374: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.71375: done advancing hosts to next task 8119 1726773062.71390: getting variables 8119 1726773062.71393: in VariableManager get_vars() 8119 1726773062.71424: Calling all_inventory to load vars for managed_node2 8119 1726773062.71428: Calling groups_inventory to load vars for managed_node2 8119 1726773062.71431: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.71453: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.71464: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.71475: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.71486: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.71504: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.71514: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.71526: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.71545: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.71558: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.71767: done with get_vars() 8119 1726773062.71777: done getting variables 8119 1726773062.71781: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.71786: done copying, going to template now 8119 1726773062.71789: done templating 8119 1726773062.71790: here goes the callback... 
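For reference, the "Check with tuned verify" task above asks TuneD itself to confirm that the currently applied profile matches the live system settings; rc=0 together with the "Verification succeeded" STDOUT is the pass condition. A sketch of an equivalent task, reconstructed from the logged invocation rather than copied from tests_change_settings.yml:115, is below. The -i flag is passed exactly as logged (tuned-adm documents it as silently skipping settings it cannot verify, but treat that reading as an assumption here), and changed_when: false is likewise assumed from the callback output.

    # Sketch of an equivalent verification step; assumptions noted above.
    - name: Check with tuned verify
      command: tuned-adm verify -i
      changed_when: false

If verification ever fails, the command's own output points at /var/log/tuned/tuned.log for the mismatching settings, as echoed in the STDOUT captured above.
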
TASK [Apply role again and remove settings] ************************************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:119 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.434) 0:00:57.274 **** 8119 1726773062.71805: sending task start callback 8119 1726773062.71807: entering _queue_task() for managed_node2/include_role 8119 1726773062.71932: worker is 1 (out of 1 available) 8119 1726773062.71971: exiting _queue_task() for managed_node2/include_role 8119 1726773062.72047: done queuing things up, now waiting for results queue to drain 8119 1726773062.72053: waiting for pending results... 10436 1726773062.72103: running TaskExecutor() for managed_node2/TASK: Apply role again and remove settings 10436 1726773062.72151: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000001e 10436 1726773062.72197: calling self._execute() 10436 1726773062.72299: _execute() done 10436 1726773062.72305: dumping result to json 10436 1726773062.72308: done dumping result, returning 10436 1726773062.72312: done running TaskExecutor() for managed_node2/TASK: Apply role again and remove settings [12a3200b-1e9d-1dbd-cc52-00000000001e] 10436 1726773062.72324: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001e 10436 1726773062.72353: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000001e 10436 1726773062.72356: WORKER PROCESS EXITING 8119 1726773062.72489: no more pending results, returning what we have 8119 1726773062.72497: in VariableManager get_vars() 8119 1726773062.72542: Calling all_inventory to load vars for managed_node2 8119 1726773062.72548: Calling groups_inventory to load vars for managed_node2 8119 1726773062.72553: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.72585: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.72599: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.72610: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.72620: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.72631: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.72637: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.72647: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.72671: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.72695: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.72990: done with get_vars() 8119 1726773062.73865: we have included files to process 8119 1726773062.73869: generating all_blocks data 8119 1726773062.73871: done generating all_blocks data 8119 1726773062.73873: processing included file: fedora.linux_system_roles.kernel_settings 8119 1726773062.73890: in VariableManager get_vars() 8119 1726773062.73918: done with get_vars() 
8119 1726773062.73969: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8119 1726773062.74019: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8119 1726773062.74041: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8119 1726773062.74099: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8119 1726773062.74480: in VariableManager get_vars() 8119 1726773062.74506: done with get_vars() 8119 1726773062.74663: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773062.74707: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773062.74819: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773062.74857: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773062.74988: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773062.75135: in VariableManager get_vars() 8119 1726773062.75157: done with get_vars() 8119 1726773062.75233: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8119 1726773062.75576: iterating over new_blocks loaded from include file 8119 1726773062.75581: in VariableManager get_vars() 8119 1726773062.75598: done with get_vars() 8119 1726773062.75601: filtering new block on tags 8119 1726773062.75641: done filtering new block on tags 8119 1726773062.75651: in VariableManager get_vars() 8119 1726773062.75665: done with get_vars() 8119 1726773062.75669: filtering new block on tags 8119 1726773062.75708: done filtering new block on tags 8119 1726773062.75718: in VariableManager get_vars() 8119 1726773062.75732: done with get_vars() 8119 1726773062.75734: filtering new block on tags 8119 1726773062.75842: done filtering new block on tags 8119 1726773062.75852: done iterating over new_blocks loaded from include file 8119 1726773062.75854: extending task lists for all hosts with included blocks 8119 1726773062.76698: done extending task lists 8119 1726773062.76702: done processing included files 8119 1726773062.76703: results queue empty 8119 1726773062.76705: checking for any_errors_fatal 8119 1726773062.76708: done checking for any_errors_fatal 8119 1726773062.76711: checking for max_fail_percentage 8119 1726773062.76713: done checking for max_fail_percentage 8119 1726773062.76714: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.76716: done checking to see if all hosts have failed 8119 1726773062.76718: getting the remaining hosts for this loop 8119 1726773062.76719: done getting the remaining hosts for this loop 8119 1726773062.76725: building list of next tasks for hosts 8119 1726773062.76728: getting the next task for host managed_node2 8119 1726773062.76733: done getting next task for host managed_node2 8119 1726773062.76735: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for 
boolean values 8119 1726773062.76738: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.76739: done building task lists 8119 1726773062.76741: counting tasks in each state of execution 8119 1726773062.76743: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.76744: advancing hosts in ITERATING_TASKS 8119 1726773062.76746: starting to advance hosts 8119 1726773062.76747: getting the next task for host managed_node2 8119 1726773062.76750: done getting next task for host managed_node2 8119 1726773062.76752: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8119 1726773062.76753: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.76755: done advancing hosts to next task 8119 1726773062.76761: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773062.76764: getting variables 8119 1726773062.76765: in VariableManager get_vars() 8119 1726773062.76779: Calling all_inventory to load vars for managed_node2 8119 1726773062.76782: Calling groups_inventory to load vars for managed_node2 8119 1726773062.76786: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.76802: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.76812: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.76823: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.76832: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.76847: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.76854: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.76864: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.76881: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 
8119 1726773062.76899: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.77090: done with get_vars() 8119 1726773062.77100: done getting variables 8119 1726773062.77105: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.77106: done copying, going to template now 8119 1726773062.77108: done templating 8119 1726773062.77112: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.053) 0:00:57.327 **** 8119 1726773062.77128: sending task start callback 8119 1726773062.77130: entering _queue_task() for managed_node2/fail 8119 1726773062.77272: worker is 1 (out of 1 available) 8119 1726773062.77315: exiting _queue_task() for managed_node2/fail 8119 1726773062.77392: done queuing things up, now waiting for results queue to drain 8119 1726773062.77398: waiting for pending results... 10438 1726773062.77452: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10438 1726773062.77507: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002e8 10438 1726773062.77556: calling self._execute() 10438 1726773062.79279: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10438 1726773062.79371: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10438 1726773062.79426: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10438 1726773062.79453: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10438 1726773062.79479: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10438 1726773062.79513: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10438 1726773062.79558: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10438 1726773062.79581: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10438 1726773062.79603: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10438 1726773062.79685: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10438 1726773062.79703: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10438 1726773062.79731: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10438 1726773062.80367: when evaluation is False, skipping this task 10438 1726773062.80371: _execute() done 10438 1726773062.80373: dumping result to json 10438 1726773062.80375: done dumping result, returning 10438 1726773062.80380: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [12a3200b-1e9d-1dbd-cc52-0000000002e8] 10438 1726773062.80390: sending task result for task 
12a3200b-1e9d-1dbd-cc52-0000000002e8 10438 1726773062.80416: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002e8 10438 1726773062.80419: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773062.80603: no more pending results, returning what we have 8119 1726773062.80608: results queue empty 8119 1726773062.80612: checking for any_errors_fatal 8119 1726773062.80620: done checking for any_errors_fatal 8119 1726773062.80622: checking for max_fail_percentage 8119 1726773062.80625: done checking for max_fail_percentage 8119 1726773062.80627: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.80629: done checking to see if all hosts have failed 8119 1726773062.80630: getting the remaining hosts for this loop 8119 1726773062.80633: done getting the remaining hosts for this loop 8119 1726773062.80641: building list of next tasks for hosts 8119 1726773062.80644: getting the next task for host managed_node2 8119 1726773062.80652: done getting next task for host managed_node2 8119 1726773062.80657: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773062.80661: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.80664: done building task lists 8119 1726773062.80666: counting tasks in each state of execution 8119 1726773062.80670: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.80672: advancing hosts in ITERATING_TASKS 8119 1726773062.80674: starting to advance hosts 8119 1726773062.80676: getting the next task for host managed_node2 8119 1726773062.80680: done getting next task for host managed_node2 8119 1726773062.80685: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773062.80690: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.80692: done advancing hosts to next task 8119 1726773062.80705: getting variables 8119 1726773062.80708: in VariableManager get_vars() 8119 1726773062.80738: Calling all_inventory to load vars for managed_node2 8119 1726773062.80742: Calling groups_inventory to load vars for managed_node2 8119 1726773062.80744: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.80765: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.80776: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.80788: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.80800: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.80816: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.80824: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.80833: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.80851: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.80865: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.81097: done with get_vars() 8119 1726773062.81108: done getting variables 8119 1726773062.81114: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.81117: done copying, going to template now 8119 1726773062.81119: done templating 8119 1726773062.81120: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.040) 0:00:57.367 **** 8119 1726773062.81136: sending task start callback 8119 1726773062.81139: entering _queue_task() for managed_node2/include_tasks 8119 1726773062.81263: worker is 1 (out of 1 available) 8119 1726773062.81303: exiting _queue_task() for managed_node2/include_tasks 8119 1726773062.81375: done queuing things up, now waiting for results queue to drain 8119 1726773062.81380: waiting for pending results... 
10440 1726773062.81434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10440 1726773062.81489: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002e9 10440 1726773062.81535: calling self._execute() 10440 1726773062.81638: _execute() done 10440 1726773062.81643: dumping result to json 10440 1726773062.81645: done dumping result, returning 10440 1726773062.81649: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [12a3200b-1e9d-1dbd-cc52-0000000002e9] 10440 1726773062.81658: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002e9 10440 1726773062.81689: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002e9 10440 1726773062.81726: WORKER PROCESS EXITING 8119 1726773062.81819: no more pending results, returning what we have 8119 1726773062.81827: in VariableManager get_vars() 8119 1726773062.81871: Calling all_inventory to load vars for managed_node2 8119 1726773062.81877: Calling groups_inventory to load vars for managed_node2 8119 1726773062.81880: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.81918: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.81934: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.81951: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.81962: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.81973: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.81979: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.81991: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.82012: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.82027: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.82240: done with get_vars() 8119 1726773062.82278: we have included files to process 8119 1726773062.82281: generating all_blocks data 8119 1726773062.82287: done generating all_blocks data 8119 1726773062.82292: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773062.82295: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773062.82298: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773062.82430: plugin lookup for setup failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773062.82509: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773062.82591: plugin lookup for stat failed; 
errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8119 1726773062.82707: done processing included file 8119 1726773062.82711: iterating over new_blocks loaded from include file 8119 1726773062.82714: in VariableManager get_vars() 8119 1726773062.82733: done with get_vars() 8119 1726773062.82735: filtering new block on tags 8119 1726773062.82779: done filtering new block on tags 8119 1726773062.82790: in VariableManager get_vars() 8119 1726773062.82808: done with get_vars() 8119 1726773062.82813: filtering new block on tags 8119 1726773062.82855: done filtering new block on tags 8119 1726773062.82864: in VariableManager get_vars() 8119 1726773062.82881: done with get_vars() 8119 1726773062.82886: filtering new block on tags 8119 1726773062.82930: done filtering new block on tags 8119 1726773062.82937: in VariableManager get_vars() 8119 1726773062.82954: done with get_vars() 8119 1726773062.82956: filtering new block on tags 8119 1726773062.83005: done filtering new block on tags 8119 1726773062.83017: done iterating over new_blocks loaded from include file 8119 1726773062.83019: extending task lists for all hosts with included blocks 8119 1726773062.83107: done extending task lists 8119 1726773062.83114: done processing included files 8119 1726773062.83116: results queue empty 8119 1726773062.83117: checking for any_errors_fatal 8119 1726773062.83120: done checking for any_errors_fatal 8119 1726773062.83122: checking for max_fail_percentage 8119 1726773062.83123: done checking for max_fail_percentage 8119 1726773062.83124: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.83126: done checking to see if all hosts have failed 8119 1726773062.83127: getting the remaining hosts for this loop 8119 1726773062.83128: done getting the remaining hosts for this loop 8119 1726773062.83132: building list of next tasks for hosts 8119 1726773062.83134: getting the next task for host managed_node2 8119 1726773062.83138: done getting next task for host managed_node2 8119 1726773062.83140: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773062.83144: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.83145: done building task lists 8119 1726773062.83147: counting tasks in each state of execution 8119 1726773062.83149: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.83150: advancing hosts in ITERATING_TASKS 8119 1726773062.83152: starting to advance hosts 8119 1726773062.83153: getting the next task for host managed_node2 8119 1726773062.83156: done getting next task for host managed_node2 8119 1726773062.83158: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773062.83160: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.83161: done advancing hosts to next task 8119 1726773062.83167: getting variables 8119 1726773062.83169: in VariableManager get_vars() 8119 1726773062.83181: Calling all_inventory to load vars for managed_node2 8119 1726773062.83186: Calling groups_inventory to load vars for managed_node2 8119 1726773062.83188: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.83201: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.83209: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.83224: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.83235: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.83246: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.83252: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.83261: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.83278: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.83294: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.83478: done with get_vars() 8119 1726773062.83492: done getting variables 8119 1726773062.83498: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.83499: done copying, going to template now 8119 1726773062.83501: done templating 8119 1726773062.83502: here goes the callback... 
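At this point the "Apply role again and remove settings" include_role has been expanded: the role's vars, defaults, meta, handlers and tasks/main.yml were loaded, set_vars.yml was included, and the engine is now iterating the role's own tasks. Several of them ("Check sysctl settings for boolean values" above, "Ensure ansible_facts used by role" and "Check if system is ostree" below) are skipped with "Conditional result was False", which appears to be the normal outcome when the role is applied a second time in the same play and its prerequisites are already satisfied. A rough, hypothetical sketch of that kind of guard is below; the variable name __role_required_facts is illustrative only and is not taken from the role source.

    # Hypothetical sketch of a fact-gathering guard that skips when the needed facts
    # are already present (names are illustrative, not from the role).
    - name: Ensure ansible_facts used by role
      setup:
        gather_subset: min
      when: __role_required_facts | difference(ansible_facts.keys() | list) | length > 0
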
TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.023) 0:00:57.391 **** 8119 1726773062.83521: sending task start callback 8119 1726773062.83522: entering _queue_task() for managed_node2/setup 8119 1726773062.83643: worker is 1 (out of 1 available) 8119 1726773062.83680: exiting _queue_task() for managed_node2/setup 8119 1726773062.83753: done queuing things up, now waiting for results queue to drain 8119 1726773062.83758: waiting for pending results... 10442 1726773062.83813: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10442 1726773062.83867: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000003fe 10442 1726773062.83915: calling self._execute() 10442 1726773062.85608: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10442 1726773062.85691: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10442 1726773062.85742: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10442 1726773062.85933: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10442 1726773062.85961: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10442 1726773062.85996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10442 1726773062.86039: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10442 1726773062.86060: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10442 1726773062.86076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10442 1726773062.86154: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10442 1726773062.86170: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10442 1726773062.86186: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10442 1726773062.86575: when evaluation is False, skipping this task 10442 1726773062.86581: _execute() done 10442 1726773062.86586: dumping result to json 10442 1726773062.86588: done dumping result, returning 10442 1726773062.86592: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [12a3200b-1e9d-1dbd-cc52-0000000003fe] 10442 1726773062.86600: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000003fe 10442 1726773062.86624: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000003fe 10442 1726773062.86628: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773062.86752: no more pending results, returning what we have 8119 1726773062.86756: results queue empty 8119 1726773062.86757: checking for any_errors_fatal 8119 1726773062.86759: done checking for any_errors_fatal 8119 1726773062.86761: checking for max_fail_percentage 8119 
1726773062.86763: done checking for max_fail_percentage 8119 1726773062.86765: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.86766: done checking to see if all hosts have failed 8119 1726773062.86767: getting the remaining hosts for this loop 8119 1726773062.86769: done getting the remaining hosts for this loop 8119 1726773062.86774: building list of next tasks for hosts 8119 1726773062.86776: getting the next task for host managed_node2 8119 1726773062.86788: done getting next task for host managed_node2 8119 1726773062.86793: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773062.86798: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.86801: done building task lists 8119 1726773062.86803: counting tasks in each state of execution 8119 1726773062.86807: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.86809: advancing hosts in ITERATING_TASKS 8119 1726773062.86813: starting to advance hosts 8119 1726773062.86815: getting the next task for host managed_node2 8119 1726773062.86819: done getting next task for host managed_node2 8119 1726773062.86821: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773062.86824: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.86825: done advancing hosts to next task 8119 1726773062.86839: getting variables 8119 1726773062.86843: in VariableManager get_vars() 8119 1726773062.86873: Calling all_inventory to load vars for managed_node2 8119 1726773062.86877: Calling groups_inventory to load vars for managed_node2 8119 1726773062.86879: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.86902: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.86918: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.86937: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.86951: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.86966: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.86974: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.86990: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.87020: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.87041: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.87291: done with get_vars() 8119 1726773062.87300: done getting variables 8119 1726773062.87305: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.87307: done copying, going to template now 8119 1726773062.87309: done templating 8119 1726773062.87312: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.038) 0:00:57.429 **** 8119 1726773062.87328: sending task start callback 8119 1726773062.87330: entering _queue_task() for managed_node2/stat 8119 1726773062.87445: worker is 1 (out of 1 available) 8119 1726773062.87481: exiting _queue_task() for managed_node2/stat 8119 1726773062.87553: done queuing things up, now waiting for results queue to drain 8119 1726773062.87558: waiting for pending results... 
10444 1726773062.87612: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10444 1726773062.87666: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000400 10444 1726773062.87712: calling self._execute() 10444 1726773062.89539: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10444 1726773062.89617: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10444 1726773062.89678: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10444 1726773062.89706: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10444 1726773062.89736: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10444 1726773062.89765: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10444 1726773062.89807: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10444 1726773062.89830: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10444 1726773062.89849: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10444 1726773062.89926: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10444 1726773062.89944: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10444 1726773062.89966: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10444 1726773062.90221: when evaluation is False, skipping this task 10444 1726773062.90226: _execute() done 10444 1726773062.90228: dumping result to json 10444 1726773062.90229: done dumping result, returning 10444 1726773062.90233: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [12a3200b-1e9d-1dbd-cc52-000000000400] 10444 1726773062.90240: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000400 10444 1726773062.90263: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000400 10444 1726773062.90266: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773062.90412: no more pending results, returning what we have 8119 1726773062.90417: results queue empty 8119 1726773062.90420: checking for any_errors_fatal 8119 1726773062.90423: done checking for any_errors_fatal 8119 1726773062.90427: checking for max_fail_percentage 8119 1726773062.90430: done checking for max_fail_percentage 8119 1726773062.90432: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.90434: done checking to see if all hosts have failed 8119 1726773062.90436: getting the remaining hosts for this loop 8119 1726773062.90438: done getting the remaining hosts for this loop 8119 1726773062.90445: building list of next tasks for hosts 8119 1726773062.90448: getting the next task for host managed_node2 8119 1726773062.90455: done getting next task for host managed_node2 8119 1726773062.90460: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 
1726773062.90465: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.90468: done building task lists 8119 1726773062.90470: counting tasks in each state of execution 8119 1726773062.90474: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.90476: advancing hosts in ITERATING_TASKS 8119 1726773062.90478: starting to advance hosts 8119 1726773062.90480: getting the next task for host managed_node2 8119 1726773062.90486: done getting next task for host managed_node2 8119 1726773062.90490: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 1726773062.90494: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.90496: done advancing hosts to next task 8119 1726773062.90512: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773062.90517: getting variables 8119 1726773062.90520: in VariableManager get_vars() 8119 1726773062.90546: Calling all_inventory to load vars for managed_node2 8119 1726773062.90549: Calling groups_inventory to load vars for managed_node2 8119 1726773062.90552: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.90571: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.90581: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.90594: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.90603: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.90615: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.90622: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.90631: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.90649: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.90662: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.90889: done with get_vars() 8119 1726773062.90900: done getting variables 8119 1726773062.90905: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.90906: done copying, going to template now 8119 1726773062.90908: done templating 8119 1726773062.90912: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.036) 0:00:57.465 **** 8119 1726773062.90931: sending task start callback 8119 1726773062.90933: entering _queue_task() for managed_node2/set_fact 8119 1726773062.91042: worker is 1 (out of 1 available) 8119 1726773062.91077: exiting _queue_task() for managed_node2/set_fact 8119 1726773062.91150: done queuing things up, now waiting for results queue to drain 8119 1726773062.91156: waiting for pending results... 
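The companion task at set_vars.yml:15 uses set_fact and is likewise skipped below. A sketch under the same assumptions (the fact and register names are hypothetical):

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __kernel_settings_ostree_booted_stat.stat.exists }}"  # hypothetical names
  when: __kernel_settings_is_ostree is not defined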
10446 1726773062.91212: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 10446 1726773062.91266: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000401 10446 1726773062.91313: calling self._execute() 10446 1726773062.93112: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10446 1726773062.93195: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10446 1726773062.93249: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10446 1726773062.93276: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10446 1726773062.93303: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10446 1726773062.93332: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10446 1726773062.93376: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10446 1726773062.93401: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10446 1726773062.93419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10446 1726773062.93508: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10446 1726773062.93527: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10446 1726773062.93541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10446 1726773062.93793: when evaluation is False, skipping this task 10446 1726773062.93798: _execute() done 10446 1726773062.93800: dumping result to json 10446 1726773062.93803: done dumping result, returning 10446 1726773062.93808: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [12a3200b-1e9d-1dbd-cc52-000000000401] 10446 1726773062.93817: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000401 10446 1726773062.93846: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000401 10446 1726773062.93850: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773062.94160: no more pending results, returning what we have 8119 1726773062.94163: results queue empty 8119 1726773062.94165: checking for any_errors_fatal 8119 1726773062.94167: done checking for any_errors_fatal 8119 1726773062.94169: checking for max_fail_percentage 8119 1726773062.94171: done checking for max_fail_percentage 8119 1726773062.94172: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.94173: done checking to see if all hosts have failed 8119 1726773062.94175: getting the remaining hosts for this loop 8119 1726773062.94176: done getting the remaining hosts for this loop 8119 1726773062.94181: building list of next tasks for hosts 8119 1726773062.94185: getting the next task for host managed_node2 8119 1726773062.94192: done getting next task for host managed_node2 8119 1726773062.94195: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update 
exists in /sbin 8119 1726773062.94198: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.94200: done building task lists 8119 1726773062.94202: counting tasks in each state of execution 8119 1726773062.94204: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.94206: advancing hosts in ITERATING_TASKS 8119 1726773062.94207: starting to advance hosts 8119 1726773062.94208: getting the next task for host managed_node2 8119 1726773062.94214: done getting next task for host managed_node2 8119 1726773062.94216: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8119 1726773062.94218: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.94220: done advancing hosts to next task 8119 1726773062.94233: getting variables 8119 1726773062.94236: in VariableManager get_vars() 8119 1726773062.94261: Calling all_inventory to load vars for managed_node2 8119 1726773062.94264: Calling groups_inventory to load vars for managed_node2 8119 1726773062.94266: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.94288: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.94299: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.94309: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.94321: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.94332: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.94339: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.94351: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.94370: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.94385: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.94587: done with get_vars() 8119 1726773062.94597: done getting variables 8119 1726773062.94602: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.94604: done copying, going to template now 8119 1726773062.94606: done templating 8119 1726773062.94607: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.036) 0:00:57.502 **** 8119 1726773062.94626: sending task start callback 8119 1726773062.94628: entering _queue_task() for managed_node2/stat 8119 1726773062.94747: worker is 1 (out of 1 available) 8119 1726773062.94785: exiting _queue_task() for managed_node2/stat 8119 1726773062.94857: done queuing things up, now waiting for results queue to drain 8119 1726773062.94862: waiting for pending results... 
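The next task, at set_vars.yml:22, is another stat check, this time for transactional-update, and it is also skipped below. A sketch, with the register and flag names assumed:

- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update  # path implied by the task name
  register: __kernel_settings_transactional_update_stat  # hypothetical name
  when: __kernel_settings_is_transactional is not defined  # hypothetical flag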
10448 1726773062.94918: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 10448 1726773062.94973: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000403 10448 1726773062.95020: calling self._execute() 10448 1726773062.96699: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10448 1726773062.96779: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10448 1726773062.96837: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10448 1726773062.96871: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10448 1726773062.96899: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10448 1726773062.96937: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10448 1726773062.96985: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10448 1726773062.97009: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10448 1726773062.97026: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10448 1726773062.97109: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10448 1726773062.97130: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10448 1726773062.97145: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10448 1726773062.97392: when evaluation is False, skipping this task 10448 1726773062.97396: _execute() done 10448 1726773062.97398: dumping result to json 10448 1726773062.97400: done dumping result, returning 10448 1726773062.97404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [12a3200b-1e9d-1dbd-cc52-000000000403] 10448 1726773062.97412: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000403 10448 1726773062.97438: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000403 10448 1726773062.97441: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773062.97620: no more pending results, returning what we have 8119 1726773062.97625: results queue empty 8119 1726773062.97627: checking for any_errors_fatal 8119 1726773062.97632: done checking for any_errors_fatal 8119 1726773062.97634: checking for max_fail_percentage 8119 1726773062.97637: done checking for max_fail_percentage 8119 1726773062.97639: checking to see if all hosts have failed and the running result is not ok 8119 1726773062.97641: done checking to see if all hosts have failed 8119 1726773062.97643: getting the remaining hosts for this loop 8119 1726773062.97645: done getting the remaining hosts for this loop 8119 1726773062.97652: building list of next tasks for hosts 8119 1726773062.97655: getting the next task for host managed_node2 8119 1726773062.97662: done getting next task for host managed_node2 8119 1726773062.97667: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if 
transactional-update exists 8119 1726773062.97672: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773062.97674: done building task lists 8119 1726773062.97676: counting tasks in each state of execution 8119 1726773062.97680: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773062.97685: advancing hosts in ITERATING_TASKS 8119 1726773062.97688: starting to advance hosts 8119 1726773062.97690: getting the next task for host managed_node2 8119 1726773062.97693: done getting next task for host managed_node2 8119 1726773062.97695: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8119 1726773062.97698: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773062.97699: done advancing hosts to next task 8119 1726773062.97712: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773062.97715: getting variables 8119 1726773062.97718: in VariableManager get_vars() 8119 1726773062.97743: Calling all_inventory to load vars for managed_node2 8119 1726773062.97746: Calling groups_inventory to load vars for managed_node2 8119 1726773062.97748: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773062.97768: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.97779: Calling all_plugins_play to load vars for managed_node2 8119 1726773062.97795: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.97808: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773062.97823: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.97830: Calling groups_plugins_play to load vars for managed_node2 8119 1726773062.97839: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.97857: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.97870: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773062.98081: done with get_vars() 8119 1726773062.98094: done getting variables 8119 1726773062.98099: sending task start callback, copying the task so we can template it temporarily 8119 1726773062.98100: done copying, going to template now 8119 1726773062.98102: done templating 8119 1726773062.98103: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.034) 0:00:57.537 **** 8119 1726773062.98122: sending task start callback 8119 1726773062.98123: entering _queue_task() for managed_node2/set_fact 8119 1726773062.98234: worker is 1 (out of 1 available) 8119 1726773062.98264: exiting _queue_task() for managed_node2/set_fact 8119 1726773062.98324: done queuing things up, now waiting for results queue to drain 8119 1726773062.98328: waiting for pending results... 
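The flag-setting task at set_vars.yml:27 mirrors the ostree case; a sketch under the same naming assumptions:

- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __kernel_settings_transactional_update_stat.stat.exists }}"  # hypothetical names
  when: __kernel_settings_is_transactional is not defined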
10450 1726773062.98506: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 10450 1726773062.98563: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000404 10450 1726773062.98613: calling self._execute() 10450 1726773063.00381: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10450 1726773063.00467: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10450 1726773063.00544: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10450 1726773063.00581: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10450 1726773063.00623: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10450 1726773063.00661: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10450 1726773063.00724: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10450 1726773063.00758: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10450 1726773063.00782: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10450 1726773063.00889: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10450 1726773063.00913: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10450 1726773063.00931: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10450 1726773063.01275: when evaluation is False, skipping this task 10450 1726773063.01282: _execute() done 10450 1726773063.01287: dumping result to json 10450 1726773063.01291: done dumping result, returning 10450 1726773063.01297: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [12a3200b-1e9d-1dbd-cc52-000000000404] 10450 1726773063.01312: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000404 10450 1726773063.01341: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000404 10450 1726773063.01345: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773063.01792: no more pending results, returning what we have 8119 1726773063.01797: results queue empty 8119 1726773063.01799: checking for any_errors_fatal 8119 1726773063.01803: done checking for any_errors_fatal 8119 1726773063.01804: checking for max_fail_percentage 8119 1726773063.01806: done checking for max_fail_percentage 8119 1726773063.01807: checking to see if all hosts have failed and the running result is not ok 8119 1726773063.01809: done checking to see if all hosts have failed 8119 1726773063.01810: getting the remaining hosts for this loop 8119 1726773063.01813: done getting the remaining hosts for this loop 8119 1726773063.01819: building list of next tasks for hosts 8119 1726773063.01820: getting the next task for host managed_node2 8119 1726773063.01828: done getting next task for host managed_node2 8119 1726773063.01832: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version 
specific variables 8119 1726773063.01835: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773063.01837: done building task lists 8119 1726773063.01839: counting tasks in each state of execution 8119 1726773063.01842: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773063.01843: advancing hosts in ITERATING_TASKS 8119 1726773063.01844: starting to advance hosts 8119 1726773063.01846: getting the next task for host managed_node2 8119 1726773063.01850: done getting next task for host managed_node2 8119 1726773063.01852: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8119 1726773063.01854: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773063.01856: done advancing hosts to next task 8119 1726773063.01868: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773063.01871: getting variables 8119 1726773063.01873: in VariableManager get_vars() 8119 1726773063.01905: Calling all_inventory to load vars for managed_node2 8119 1726773063.01912: Calling groups_inventory to load vars for managed_node2 8119 1726773063.01917: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773063.01940: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.01950: Calling all_plugins_play to load vars for managed_node2 8119 1726773063.01960: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.01969: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773063.01979: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.01987: Calling groups_plugins_play to load vars for managed_node2 8119 1726773063.01999: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.02018: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.02037: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.02252: done with get_vars() 8119 1726773063.02265: done getting variables 8119 1726773063.02271: sending task start callback, copying the task so we can template it temporarily 8119 1726773063.02273: done copying, going to template now 8119 1726773063.02274: done templating 8119 1726773063.02276: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:03 -0400 (0:00:00.041) 0:00:57.579 **** 8119 1726773063.02295: sending task start callback 8119 1726773063.02297: entering _queue_task() for managed_node2/include_vars 8119 1726773063.02421: worker is 1 (out of 1 available) 8119 1726773063.02459: exiting _queue_task() for managed_node2/include_vars 8119 1726773063.02531: done queuing things up, now waiting for results queue to drain 8119 1726773063.02536: waiting for pending results... 
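The task at set_vars.yml:31 uses include_vars together with the first_found lookup (both are loaded by the worker below), and the result that follows shows vars/default.yml as the file picked up. A sketch of the usual pattern; the candidate filenames other than default.yml are assumptions:

- name: Set platform/version specific variables
  include_vars: "{{ item }}"
  with_first_found:
    - files:
        # candidate names are assumptions; only default.yml is confirmed by the result below
        - "{{ ansible_distribution }}_{{ ansible_distribution_version }}.yml"
        - "{{ ansible_distribution }}.yml"
        - "{{ ansible_os_family }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"

A default.yml consistent with the ansible_facts reported in the result below would contain:

__kernel_settings_packages:
  - tuned
  - python3-configobj
__kernel_settings_services:
  - tuned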
10453 1726773063.02601: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 10453 1726773063.02660: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000406 10453 1726773063.02706: calling self._execute() 10453 1726773063.04759: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10453 1726773063.04891: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10453 1726773063.04963: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10453 1726773063.05003: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10453 1726773063.05046: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10453 1726773063.05086: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10453 1726773063.05147: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10453 1726773063.05179: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10453 1726773063.05205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10453 1726773063.05316: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10453 1726773063.05340: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10453 1726773063.05363: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10453 1726773063.06434: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup 10453 1726773063.06641: Loaded config def from plugin (lookup/first_found) 10453 1726773063.06647: Loading LookupModule 'first_found' from /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup/first_found.py 10453 1726773063.06716: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10453 1726773063.06762: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10453 1726773063.06773: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10453 1726773063.06788: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10453 1726773063.06795: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10453 1726773063.06913: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10453 1726773063.06930: starting attempt loop 10453 1726773063.06933: running the handler 10453 1726773063.06978: handler run complete 10453 1726773063.06986: attempt loop complete, returning result 10453 1726773063.06989: _execute() done 10453 1726773063.06992: dumping result to json 10453 1726773063.06995: done dumping result, returning 10453 1726773063.07001: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [12a3200b-1e9d-1dbd-cc52-000000000406] 10453 1726773063.07009: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000406 10453 1726773063.07036: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000406 10453 1726773063.07040: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8119 1726773063.07434: no more pending results, returning what we have 8119 1726773063.07441: results queue empty 8119 1726773063.07444: checking for any_errors_fatal 8119 1726773063.07448: done checking for any_errors_fatal 8119 1726773063.07450: checking for max_fail_percentage 8119 1726773063.07454: done checking for max_fail_percentage 8119 1726773063.07456: checking to see if all hosts have failed and the running result is not ok 8119 1726773063.07458: done checking to see if all hosts have failed 8119 1726773063.07460: getting the remaining hosts for this loop 8119 1726773063.07463: done getting the remaining hosts for this loop 8119 1726773063.07471: building list of next tasks for hosts 8119 1726773063.07474: getting the next task for host managed_node2 8119 1726773063.07484: done getting next task for host managed_node2 8119 1726773063.07490: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773063.07496: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773063.07499: done building task lists 8119 1726773063.07501: counting tasks in each state of execution 8119 1726773063.07506: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773063.07508: advancing hosts in ITERATING_TASKS 8119 1726773063.07513: starting to advance hosts 8119 1726773063.07515: getting the next task for host managed_node2 8119 1726773063.07522: done getting next task for host managed_node2 8119 1726773063.07526: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773063.07529: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773063.07532: done advancing hosts to next task 8119 1726773063.07548: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773063.07554: getting variables 8119 1726773063.07558: in VariableManager get_vars() 8119 1726773063.07597: Calling all_inventory to load vars for managed_node2 8119 1726773063.07604: Calling groups_inventory to load vars for managed_node2 8119 1726773063.07608: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773063.07641: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.07657: Calling all_plugins_play to load vars for managed_node2 8119 1726773063.07675: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.07693: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773063.07715: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.07727: Calling groups_plugins_play to load vars for managed_node2 8119 1726773063.07745: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.07785: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.07814: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773063.08162: done with get_vars() 8119 1726773063.08177: done getting variables 8119 1726773063.08187: sending task start callback, copying the task so we can template it temporarily 8119 1726773063.08191: done copying, going to template now 8119 1726773063.08194: done templating 8119 1726773063.08196: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:03 -0400 (0:00:00.059) 0:00:57.638 **** 8119 1726773063.08223: sending task start callback 8119 1726773063.08227: entering _queue_task() for managed_node2/package 8119 1726773063.08381: worker is 1 (out of 1 available) 8119 1726773063.08433: exiting _queue_task() for managed_node2/package 8119 1726773063.08508: done queuing things up, now waiting for results queue to drain 8119 1726773063.08516: waiting for pending results... 
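The task at roles/kernel_settings/tasks/main.yml:12 uses the package action and, given the __kernel_settings_packages fact loaded above, plausibly amounts to something like the following (state: present is an assumption, not shown in the log):

- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"  # tuned, python3-configobj per the vars loaded above
    state: present  # assumed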
10457 1726773063.08753: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 10457 1726773063.08824: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002ea 10457 1726773063.08881: calling self._execute() 10457 1726773063.11681: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10457 1726773063.11801: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10457 1726773063.11866: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10457 1726773063.11902: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10457 1726773063.11939: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10457 1726773063.11974: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10457 1726773063.12029: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10457 1726773063.12057: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10457 1726773063.12077: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10457 1726773063.12178: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10457 1726773063.12203: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10457 1726773063.12223: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10457 1726773063.12414: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10457 1726773063.12419: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10457 1726773063.12423: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10457 1726773063.12425: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10457 1726773063.12428: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10457 1726773063.12431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10457 1726773063.12434: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10457 1726773063.12437: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10457 1726773063.12439: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10457 1726773063.12459: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 10457 1726773063.12462: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10457 1726773063.12465: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10457 1726773063.12694: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10457 1726773063.12744: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10457 1726773063.12759: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10457 1726773063.12779: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10457 1726773063.12790: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10457 1726773063.12920: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10457 1726773063.12942: starting attempt loop 10457 1726773063.12946: running the handler 10457 1726773063.13123: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale 10457 1726773063.13139: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity 10457 1726773063.13149: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap 10457 1726773063.13161: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox 10457 1726773063.13175: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios 10457 1726773063.13208: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/__pycache__ 10457 1726773063.13230: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics/__pycache__ 10457 1726773063.13240: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale/__pycache__ 10457 1726773063.13248: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity/__pycache__ 10457 1726773063.13254: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap/__pycache__ 10457 1726773063.13262: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox/__pycache__ 10457 1726773063.13272: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios/__pycache__ 10457 1726773063.13296: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10 10457 1726773063.13310: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci 10457 1726773063.13472: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos 10457 1726773063.13486: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos 10457 1726773063.13513: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba 10457 1726773063.13525: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa 10457 1726773063.13537: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi 10457 1726773063.13639: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch 10457 1726773063.13652: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point 10457 1726773063.13804: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix 10457 1726773063.13814: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli 10457 1726773063.13825: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine 10457 1726773063.13928: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision 10457 1726773063.13938: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos 10457 1726773063.13986: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus 10457 1726773063.14006: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10 10457 1726773063.14019: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6 10457 1726773063.14030: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9 10457 1726773063.14041: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos 10457 1726773063.14052: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch 10457 1726773063.14062: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos 10457 1726773063.14073: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos 10457 1726773063.14115: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli 10457 1726773063.14126: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos 10457 1726773063.14138: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5 10457 1726773063.14395: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files 10457 1726773063.14407: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer 10457 1726773063.14416: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager 10457 1726773063.14461: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios 10457 1726773063.15122: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr 10457 1726773063.15134: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd 10457 1726773063.15145: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx 10457 1726773063.15170: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos 10457 1726773063.15191: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate 10457 1726773063.15202: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface 10457 1726773063.15213: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios 10457 1726773063.15245: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr 10457 1726773063.15264: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware 10457 1726773063.15270: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential 10457 1726773063.15276: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/junos 10457 1726773063.15314: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2 10457 1726773063.15326: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3 10457 1726773063.15334: 
trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki 10457 1726773063.15367: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact 10457 1726773063.15376: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf 10457 1726773063.15388: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler 10457 1726773063.15414: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor 10457 1726773063.15499: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos 10457 1726773063.15511: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso 10457 1726773063.15526: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage 10457 1726773063.15534: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos 10457 1726773063.15659: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx 10457 1726773063.15708: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx 10457 1726773063.15718: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance 10457 1726773063.15727: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs 10457 1726773063.15739: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos 10457 1726773063.15780: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol 10457 1726773063.15792: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware 10457 1726773063.15803: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf 10457 1726773063.15812: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros 10457 1726773063.15821: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing 10457 1726773063.15829: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive 10457 1726773063.15839: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos 10457 1726773063.15858: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros 10457 1726773063.15868: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system 10457 1726773063.15881: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss 10457 1726773063.15893: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos 10457 1726773063.15929: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/__pycache__ 10457 1726773063.15937: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10/__pycache__ 10457 1726773063.15946: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci/__pycache__ 10457 1726773063.16051: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos/__pycache__ 10457 1726773063.16061: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos/__pycache__ 10457 1726773063.16078: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba/__pycache__ 10457 1726773063.16087: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa/__pycache__ 10457 1726773063.16097: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi/__pycache__ 10457 1726773063.16161: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch/__pycache__ 10457 1726773063.16171: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point/__pycache__ 10457 1726773063.16260: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix/__pycache__ 10457 1726773063.16268: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli/__pycache__ 10457 1726773063.16275: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine/__pycache__ 10457 1726773063.16346: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision/__pycache__ 10457 1726773063.16355: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos/__pycache__ 10457 1726773063.16389: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus/__pycache__ 10457 1726773063.16405: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10/__pycache__ 10457 1726773063.16414: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6/__pycache__ 10457 1726773063.16423: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9/__pycache__ 10457 1726773063.16432: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos/__pycache__ 10457 1726773063.16441: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch/__pycache__ 10457 1726773063.16448: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos/__pycache__ 10457 1726773063.16457: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos/__pycache__ 10457 1726773063.16489: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli/__pycache__ 10457 1726773063.16499: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos/__pycache__ 10457 1726773063.16509: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5/__pycache__ 10457 1726773063.16690: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files/__pycache__ 10457 1726773063.16701: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer/__pycache__ 10457 1726773063.16708: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager/__pycache__ 10457 1726773063.16740: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios/__pycache__ 10457 1726773063.17160: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr/__pycache__ 10457 1726773063.17171: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd/__pycache__ 10457 1726773063.17181: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx/__pycache__ 10457 1726773063.17203: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos/__pycache__ 10457 1726773063.17220: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate/__pycache__ 10457 1726773063.17228: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface/__pycache__ 10457 1726773063.17237: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios/__pycache__ 10457 1726773063.17267: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr/__pycache__ 10457 1726773063.17293: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware/__pycache__ 10457 1726773063.17303: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential/__pycache__ 10457 1726773063.17311: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/junos/__pycache__ 10457 1726773063.17344: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2/__pycache__ 10457 1726773063.17354: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3/__pycache__ 10457 1726773063.17361: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki/__pycache__ 10457 1726773063.17386: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact/__pycache__ 10457 1726773063.17393: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf/__pycache__ 10457 1726773063.17400: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler/__pycache__ 10457 1726773063.17418: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor/__pycache__ 10457 1726773063.17467: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos/__pycache__ 10457 1726773063.17474: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso/__pycache__ 10457 1726773063.17485: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage/__pycache__ 10457 1726773063.17491: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos/__pycache__ 10457 1726773063.17569: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx/__pycache__ 10457 1726773063.17598: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx/__pycache__ 10457 1726773063.17605: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance/__pycache__ 10457 1726773063.17612: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs/__pycache__ 10457 1726773063.17620: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos/__pycache__ 10457 1726773063.17645: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol/__pycache__ 10457 1726773063.17651: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware/__pycache__ 10457 1726773063.17658: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf/__pycache__ 10457 1726773063.17664: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros/__pycache__ 10457 1726773063.17670: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing/__pycache__ 10457 1726773063.17675: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive/__pycache__ 10457 1726773063.17685: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos/__pycache__ 10457 1726773063.17699: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros/__pycache__ 10457 1726773063.17706: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system/__pycache__ 10457 1726773063.17715: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss/__pycache__ 10457 1726773063.17722: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos/__pycache__ 10457 1726773063.17744: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification/__pycache__ 10457 1726773063.17777: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/language 10457 1726773063.17803: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/os 10457 1726773063.17885: _low_level_execute_command(): starting 10457 1726773063.17893: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10457 1726773063.20697: stdout chunk (state=2): >>>/root <<< 10457 1726773063.20956: stderr chunk (state=3): >>><<< 10457 1726773063.20965: stdout chunk (state=3): >>><<< 10457 1726773063.20998: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10457 1726773063.21023: _low_level_execute_command(): starting 10457 1726773063.21032: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360 `" && echo ansible-tmp-1726773063.2101421-10457-78763079155360="` echo /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360 `" ) && sleep 0' 10457 1726773063.24141: stdout chunk (state=2): >>>ansible-tmp-1726773063.2101421-10457-78763079155360=/root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360 <<< 10457 1726773063.24299: stderr chunk (state=3): >>><<< 10457 1726773063.24308: stdout chunk (state=3): >>><<< 10457 1726773063.24338: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773063.2101421-10457-78763079155360=/root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360 , stderr= 10457 1726773063.24482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/dnf-ZIP_DEFLATED 10457 1726773063.24566: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360/AnsiballZ_dnf.py 10457 1726773063.25333: Sending initial data 10457 1726773063.25348: Sent initial data (150 bytes) 10457 1726773063.27996: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpz_ehp6mz /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360/AnsiballZ_dnf.py <<< 10457 1726773063.29268: stderr chunk (state=3): >>><<< 10457 1726773063.29276: stdout chunk (state=3): >>><<< 10457 1726773063.29303: done transferring module to remote 10457 1726773063.29321: _low_level_execute_command(): starting 10457 1726773063.29325: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360/ /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360/AnsiballZ_dnf.py && sleep 0' 10457 1726773063.32096: stderr chunk (state=2): >>><<< 10457 1726773063.32118: stdout chunk (state=2): >>><<< 10457 1726773063.32149: _low_level_execute_command() done: rc=0, stdout=, stderr= 10457 1726773063.32157: _low_level_execute_command(): starting 10457 1726773063.32166: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360/AnsiballZ_dnf.py && sleep 0' 10457 1726773067.88307: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10457 
1726773067.91354: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10457 1726773067.91398: stderr chunk (state=3): >>><<< 10457 1726773067.91406: stdout chunk (state=3): >>><<< 10457 1726773067.91429: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.150 closed. 10457 1726773067.91463: done with _execute_module (dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10457 1726773067.91472: _low_level_execute_command(): starting 10457 1726773067.91476: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773063.2101421-10457-78763079155360/ > /dev/null 2>&1 && sleep 0' 10457 1726773067.94206: stderr chunk (state=2): >>><<< 10457 1726773067.94220: stdout chunk (state=2): >>><<< 10457 1726773067.94242: _low_level_execute_command() done: rc=0, stdout=, stderr= 10457 1726773067.94251: handler run complete 10457 1726773067.94288: attempt loop complete, returning result 10457 1726773067.94305: _execute() done 10457 1726773067.94308: dumping result to json 10457 1726773067.94314: done dumping result, returning 10457 1726773067.94329: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [12a3200b-1e9d-1dbd-cc52-0000000002ea] 10457 1726773067.94341: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ea 10457 1726773067.94381: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ea 10457 1726773067.94422: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8119 1726773067.94581: no more pending results, returning what we have 8119 1726773067.94589: results queue empty 8119 1726773067.94592: checking for any_errors_fatal 8119 1726773067.94598: done checking for any_errors_fatal 8119 1726773067.94599: checking for max_fail_percentage 8119 1726773067.94602: done checking for max_fail_percentage 8119 1726773067.94604: checking to see if all hosts have failed and the running result is not ok 8119 1726773067.94606: done checking to see if all hosts have failed 8119 1726773067.94608: getting the remaining hosts for this loop 8119 1726773067.94611: done getting the remaining hosts for 
this loop 8119 1726773067.94619: building list of next tasks for hosts 8119 1726773067.94621: getting the next task for host managed_node2 8119 1726773067.94630: done getting next task for host managed_node2 8119 1726773067.94634: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773067.94638: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773067.94641: done building task lists 8119 1726773067.94643: counting tasks in each state of execution 8119 1726773067.94646: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773067.94649: advancing hosts in ITERATING_TASKS 8119 1726773067.94651: starting to advance hosts 8119 1726773067.94653: getting the next task for host managed_node2 8119 1726773067.94658: done getting next task for host managed_node2 8119 1726773067.94661: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773067.94664: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773067.94666: done advancing hosts to next task 8119 1726773067.94681: Loading ActionModule 'debug' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773067.94689: getting variables 8119 1726773067.94692: in VariableManager get_vars() 8119 1726773067.94723: Calling all_inventory to load vars for managed_node2 8119 1726773067.94728: Calling groups_inventory to load vars for managed_node2 8119 1726773067.94730: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773067.94752: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.94763: Calling all_plugins_play to load vars for managed_node2 8119 1726773067.94773: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.94782: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773067.94796: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.94803: Calling groups_plugins_play to load vars for managed_node2 8119 1726773067.94814: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.94837: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.94851: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.95055: done with get_vars() 8119 1726773067.95066: done getting variables 8119 1726773067.95071: sending task start callback, copying the task so we can template it temporarily 8119 1726773067.95073: done copying, going to template now 8119 1726773067.95075: done templating 8119 1726773067.95076: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:07 -0400 (0:00:04.868) 0:01:02.507 **** 8119 1726773067.95095: sending task start callback 8119 1726773067.95097: entering _queue_task() for managed_node2/debug 8119 1726773067.95221: worker is 1 (out of 1 available) 8119 1726773067.95261: exiting _queue_task() for managed_node2/debug 8119 1726773067.95334: done queuing things up, now waiting for results queue to drain 8119 1726773067.95340: waiting for pending results... 
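Note: the dnf result captured above (name=["tuned", "python3-configobj"], state=present) corresponds to a task roughly like the following. This is a reconstruction from the logged module_args only, not the role's actual source; indentation and surrounding context are assumed.

```yaml
# Reconstructed sketch of the "Ensure required packages are installed" task,
# based solely on the module_args shown in the log above.
- name: Ensure required packages are installed
  dnf:
    name:
      - tuned
      - python3-configobj
    state: present
```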
10590 1726773067.95406: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10590 1726773067.95461: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002ec 10590 1726773067.95517: calling self._execute() 10590 1726773067.97300: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10590 1726773067.97388: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10590 1726773067.97494: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10590 1726773067.97527: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10590 1726773067.97553: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10590 1726773067.97589: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10590 1726773067.97633: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10590 1726773067.97656: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10590 1726773067.97675: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10590 1726773067.97756: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10590 1726773067.97773: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10590 1726773067.97792: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10590 1726773067.98069: when evaluation is False, skipping this task 10590 1726773067.98073: _execute() done 10590 1726773067.98075: dumping result to json 10590 1726773067.98077: done dumping result, returning 10590 1726773067.98081: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000002ec] 10590 1726773067.98092: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ec 10590 1726773067.98119: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ec 10590 1726773067.98123: WORKER PROCESS EXITING skipping: [managed_node2] => {} 8119 1726773067.98323: no more pending results, returning what we have 8119 1726773067.98328: results queue empty 8119 1726773067.98330: checking for any_errors_fatal 8119 1726773067.98335: done checking for any_errors_fatal 8119 1726773067.98338: checking for max_fail_percentage 8119 1726773067.98340: done checking for max_fail_percentage 8119 1726773067.98342: checking to see if all hosts have failed and the running result is not ok 8119 1726773067.98344: done checking to see if all hosts have failed 8119 1726773067.98346: getting the remaining hosts for this loop 8119 1726773067.98349: done getting the remaining hosts for this loop 8119 1726773067.98356: building list of next tasks for hosts 8119 1726773067.98359: getting the next task for host managed_node2 8119 1726773067.98365: done getting next task for host managed_node2 8119 1726773067.98370: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773067.98374: ^ state 
is: HOST STATE: block=2, task=24, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773067.98377: done building task lists 8119 1726773067.98379: counting tasks in each state of execution 8119 1726773067.98384: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773067.98387: advancing hosts in ITERATING_TASKS 8119 1726773067.98390: starting to advance hosts 8119 1726773067.98392: getting the next task for host managed_node2 8119 1726773067.98396: done getting next task for host managed_node2 8119 1726773067.98399: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773067.98401: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773067.98402: done advancing hosts to next task 8119 1726773067.98415: Loading ActionModule 'reboot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773067.98419: getting variables 8119 1726773067.98421: in VariableManager get_vars() 8119 1726773067.98447: Calling all_inventory to load vars for managed_node2 8119 1726773067.98451: Calling groups_inventory to load vars for managed_node2 8119 1726773067.98453: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773067.98474: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.98486: Calling all_plugins_play to load vars for managed_node2 8119 1726773067.98498: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.98511: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773067.98525: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.98532: Calling groups_plugins_play to load vars for managed_node2 8119 1726773067.98541: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.98559: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.98573: Loading VarsModule 
'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773067.98791: done with get_vars() 8119 1726773067.98802: done getting variables 8119 1726773067.98807: sending task start callback, copying the task so we can template it temporarily 8119 1726773067.98809: done copying, going to template now 8119 1726773067.98812: done templating 8119 1726773067.98813: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:07 -0400 (0:00:00.037) 0:01:02.544 **** 8119 1726773067.98829: sending task start callback 8119 1726773067.98831: entering _queue_task() for managed_node2/reboot 8119 1726773067.98947: worker is 1 (out of 1 available) 8119 1726773067.98987: exiting _queue_task() for managed_node2/reboot 8119 1726773067.99058: done queuing things up, now waiting for results queue to drain 8119 1726773067.99063: waiting for pending results... 10592 1726773067.99130: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10592 1726773067.99180: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002ed 10592 1726773067.99227: calling self._execute() 10592 1726773068.01033: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10592 1726773068.01115: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10592 1726773068.01177: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10592 1726773068.01209: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10592 1726773068.01239: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10592 1726773068.01270: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10592 1726773068.01315: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10592 1726773068.01339: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10592 1726773068.01357: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10592 1726773068.01439: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10592 1726773068.01456: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10592 1726773068.01470: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10592 1726773068.01740: when evaluation is False, skipping this task 10592 1726773068.01744: _execute() done 10592 1726773068.01746: dumping result to json 10592 1726773068.01748: done dumping result, returning 10592 1726773068.01753: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [12a3200b-1e9d-1dbd-cc52-0000000002ed] 10592 1726773068.01763: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ed 10592 1726773068.01791: done sending task 
result for task 12a3200b-1e9d-1dbd-cc52-0000000002ed 10592 1726773068.01818: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773068.01951: no more pending results, returning what we have 8119 1726773068.01956: results queue empty 8119 1726773068.01958: checking for any_errors_fatal 8119 1726773068.01962: done checking for any_errors_fatal 8119 1726773068.01964: checking for max_fail_percentage 8119 1726773068.01967: done checking for max_fail_percentage 8119 1726773068.01969: checking to see if all hosts have failed and the running result is not ok 8119 1726773068.01971: done checking to see if all hosts have failed 8119 1726773068.01973: getting the remaining hosts for this loop 8119 1726773068.01975: done getting the remaining hosts for this loop 8119 1726773068.01984: building list of next tasks for hosts 8119 1726773068.01988: getting the next task for host managed_node2 8119 1726773068.01995: done getting next task for host managed_node2 8119 1726773068.01999: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773068.02003: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773068.02006: done building task lists 8119 1726773068.02008: counting tasks in each state of execution 8119 1726773068.02012: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773068.02015: advancing hosts in ITERATING_TASKS 8119 1726773068.02017: starting to advance hosts 8119 1726773068.02019: getting the next task for host managed_node2 8119 1726773068.02023: done getting next task for host managed_node2 8119 1726773068.02026: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773068.02029: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773068.02031: done advancing hosts to next task 8119 1726773068.02045: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773068.02051: getting variables 8119 1726773068.02054: in VariableManager get_vars() 8119 1726773068.02086: Calling all_inventory to load vars for managed_node2 8119 1726773068.02091: Calling groups_inventory to load vars for managed_node2 8119 1726773068.02093: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773068.02115: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.02126: Calling all_plugins_play to load vars for managed_node2 8119 1726773068.02136: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.02145: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773068.02155: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.02161: Calling groups_plugins_play to load vars for managed_node2 8119 1726773068.02170: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.02190: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.02205: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.02428: done with get_vars() 8119 1726773068.02439: done getting variables 8119 1726773068.02443: sending task start callback, copying the task so we can template it temporarily 8119 1726773068.02445: done copying, going to template now 8119 1726773068.02447: done templating 8119 1726773068.02448: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:08 -0400 (0:00:00.036) 0:01:02.581 **** 8119 1726773068.02464: sending task start callback 8119 1726773068.02466: entering _queue_task() for managed_node2/fail 8119 1726773068.02586: worker is 1 (out of 1 available) 8119 1726773068.02623: exiting _queue_task() for managed_node2/fail 8119 1726773068.02695: done queuing things up, now waiting for results queue to drain 8119 1726773068.02700: waiting for pending results... 
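Note: the "when evaluation is False, skipping this task" / "Conditional result was False" messages above and below come from `when:` conditions on the reboot and fail tasks. A sketch of that gating pattern is shown here for orientation; the variable names and message text are hypothetical and are not taken from the role source.

```yaml
# Illustrative only -- variable names below are hypothetical placeholders.
- name: Reboot transactional update systems
  reboot:
  when: __reboot_needed_transactional | d(false)   # hypothetical flag

- name: Fail if reboot is needed and not set
  fail:
    msg: Reboot is required to apply changes, but rebooting is not allowed.  # hypothetical message
  when:
    - __reboot_needed | d(false)        # hypothetical flag
    - not __reboot_allowed | d(false)   # hypothetical flag
```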
10594 1726773068.02760: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10594 1726773068.02818: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002ee 10594 1726773068.02864: calling self._execute() 10594 1726773068.04643: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10594 1726773068.04753: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10594 1726773068.04806: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10594 1726773068.04835: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10594 1726773068.04864: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10594 1726773068.04895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10594 1726773068.04941: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10594 1726773068.04966: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10594 1726773068.04987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10594 1726773068.05076: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10594 1726773068.05096: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10594 1726773068.05114: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10594 1726773068.05374: when evaluation is False, skipping this task 10594 1726773068.05378: _execute() done 10594 1726773068.05380: dumping result to json 10594 1726773068.05382: done dumping result, returning 10594 1726773068.05388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [12a3200b-1e9d-1dbd-cc52-0000000002ee] 10594 1726773068.05396: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ee 10594 1726773068.05422: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ee 10594 1726773068.05426: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773068.05658: no more pending results, returning what we have 8119 1726773068.05662: results queue empty 8119 1726773068.05664: checking for any_errors_fatal 8119 1726773068.05668: done checking for any_errors_fatal 8119 1726773068.05670: checking for max_fail_percentage 8119 1726773068.05673: done checking for max_fail_percentage 8119 1726773068.05675: checking to see if all hosts have failed and the running result is not ok 8119 1726773068.05677: done checking to see if all hosts have failed 8119 1726773068.05678: getting the remaining hosts for this loop 8119 1726773068.05683: done getting the remaining hosts for this loop 8119 1726773068.05692: building list of next tasks for hosts 8119 1726773068.05695: getting the next task for host managed_node2 8119 1726773068.05707: done getting next task for host managed_node2 8119 1726773068.05715: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 
1726773068.05719: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773068.05723: done building task lists 8119 1726773068.05725: counting tasks in each state of execution 8119 1726773068.05728: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773068.05730: advancing hosts in ITERATING_TASKS 8119 1726773068.05733: starting to advance hosts 8119 1726773068.05735: getting the next task for host managed_node2 8119 1726773068.05740: done getting next task for host managed_node2 8119 1726773068.05743: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 1726773068.05746: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773068.05748: done advancing hosts to next task 8119 1726773068.05793: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773068.05800: getting variables 8119 1726773068.05804: in VariableManager get_vars() 8119 1726773068.05838: Calling all_inventory to load vars for managed_node2 8119 1726773068.05845: Calling groups_inventory to load vars for managed_node2 8119 1726773068.05849: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773068.05877: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.05895: Calling all_plugins_play to load vars for managed_node2 8119 1726773068.05914: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.05930: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773068.05948: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.05959: Calling groups_plugins_play to load vars for managed_node2 8119 1726773068.05975: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.06006: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.06027: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, 
class_only=False) 8119 1726773068.06332: done with get_vars() 8119 1726773068.06347: done getting variables 8119 1726773068.06354: sending task start callback, copying the task so we can template it temporarily 8119 1726773068.06357: done copying, going to template now 8119 1726773068.06360: done templating 8119 1726773068.06362: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:08 -0400 (0:00:00.039) 0:01:02.620 **** 8119 1726773068.06389: sending task start callback 8119 1726773068.06392: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773068.06539: worker is 1 (out of 1 available) 8119 1726773068.06574: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773068.06647: done queuing things up, now waiting for results queue to drain 8119 1726773068.06653: waiting for pending results... 10597 1726773068.06884: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10597 1726773068.06950: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f0 10597 1726773068.07006: calling self._execute() 10597 1726773068.08838: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10597 1726773068.08920: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10597 1726773068.08974: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10597 1726773068.09002: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10597 1726773068.09039: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10597 1726773068.09072: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10597 1726773068.09135: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10597 1726773068.09166: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10597 1726773068.09187: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10597 1726773068.09294: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10597 1726773068.09317: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10597 1726773068.09333: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10597 1726773068.09587: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10597 1726773068.09643: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10597 1726773068.09657: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10597 1726773068.09672: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10597 1726773068.09680: Loading ShellModule 'sh' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10597 1726773068.09785: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10597 1726773068.09804: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10597 1726773068.09835: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 10597 1726773068.09855: starting attempt loop 10597 1726773068.09858: running the handler 10597 1726773068.09869: _low_level_execute_command(): starting 10597 1726773068.09874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10597 1726773068.12575: stdout chunk (state=2): >>>/root <<< 10597 1726773068.12708: stderr chunk (state=3): >>><<< 10597 1726773068.12716: stdout chunk (state=3): >>><<< 10597 1726773068.12743: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10597 1726773068.12762: _low_level_execute_command(): starting 10597 1726773068.12773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557 `" && echo ansible-tmp-1726773068.127548-10597-7744031209557="` echo /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557 `" ) && sleep 0' 10597 1726773068.15831: stdout chunk (state=2): >>>ansible-tmp-1726773068.127548-10597-7744031209557=/root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557 <<< 10597 1726773068.15987: stderr chunk (state=3): >>><<< 10597 1726773068.15993: stdout chunk (state=3): >>><<< 10597 1726773068.16013: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773068.127548-10597-7744031209557=/root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557 , stderr= 10597 1726773068.16117: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 10597 1726773068.16185: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557/AnsiballZ_kernel_settings_get_config.py 10597 1726773068.16928: Sending initial data 10597 1726773068.16949: Sent initial data (171 bytes) 10597 1726773068.19326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpfwc57ffa /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557/AnsiballZ_kernel_settings_get_config.py <<< 10597 1726773068.20515: stderr chunk (state=3): >>><<< 10597 1726773068.20523: stdout chunk (state=3): >>><<< 10597 1726773068.20551: done transferring module to remote 10597 1726773068.20569: _low_level_execute_command(): starting 10597 1726773068.20576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557/ /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10597 1726773068.23394: stderr chunk (state=2): >>><<< 10597 1726773068.23406: stdout chunk (state=2): >>><<< 10597 1726773068.23435: _low_level_execute_command() done: rc=0, stdout=, 
stderr= 10597 1726773068.23440: _low_level_execute_command(): starting 10597 1726773068.23447: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10597 1726773068.38718: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10597 1726773068.39802: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10597 1726773068.39821: stderr chunk (state=3): >>><<< 10597 1726773068.39827: stdout chunk (state=3): >>><<< 10597 1726773068.39846: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 10597 1726773068.39922: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10597 1726773068.39940: _low_level_execute_command(): starting 10597 1726773068.39948: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773068.127548-10597-7744031209557/ > /dev/null 2>&1 && sleep 0' 10597 1726773068.43296: stderr chunk (state=2): >>><<< 10597 1726773068.43309: stdout chunk (state=2): >>><<< 10597 1726773068.43331: _low_level_execute_command() done: rc=0, stdout=, stderr= 10597 1726773068.43337: handler run complete 10597 1726773068.43364: attempt loop complete, returning result 10597 1726773068.43376: _execute() done 10597 1726773068.43378: dumping result to json 10597 1726773068.43381: done dumping result, returning 10597 1726773068.43399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [12a3200b-1e9d-1dbd-cc52-0000000002f0] 10597 1726773068.43418: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f0 10597 1726773068.43692: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f0 10597 1726773068.43698: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", 
"update_interval": "10" } } 8119 1726773068.43951: no more pending results, returning what we have 8119 1726773068.43958: results queue empty 8119 1726773068.43961: checking for any_errors_fatal 8119 1726773068.43967: done checking for any_errors_fatal 8119 1726773068.43970: checking for max_fail_percentage 8119 1726773068.43974: done checking for max_fail_percentage 8119 1726773068.43976: checking to see if all hosts have failed and the running result is not ok 8119 1726773068.43979: done checking to see if all hosts have failed 8119 1726773068.43981: getting the remaining hosts for this loop 8119 1726773068.43987: done getting the remaining hosts for this loop 8119 1726773068.43996: building list of next tasks for hosts 8119 1726773068.44000: getting the next task for host managed_node2 8119 1726773068.44009: done getting next task for host managed_node2 8119 1726773068.44014: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773068.44018: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773068.44021: done building task lists 8119 1726773068.44023: counting tasks in each state of execution 8119 1726773068.44028: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773068.44031: advancing hosts in ITERATING_TASKS 8119 1726773068.44034: starting to advance hosts 8119 1726773068.44036: getting the next task for host managed_node2 8119 1726773068.44041: done getting next task for host managed_node2 8119 1726773068.44044: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773068.44048: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773068.44050: done advancing hosts to next task 8119 1726773068.44066: getting variables 8119 1726773068.44070: in VariableManager get_vars() 8119 1726773068.44113: Calling all_inventory to load vars for managed_node2 8119 1726773068.44120: Calling groups_inventory to load vars for managed_node2 8119 1726773068.44125: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773068.44155: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.44171: Calling all_plugins_play to load vars for managed_node2 8119 1726773068.44191: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.44206: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773068.44224: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.44235: Calling groups_plugins_play to load vars for managed_node2 8119 1726773068.44252: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.44281: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.44308: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773068.44636: done with get_vars() 8119 1726773068.44649: done getting variables 8119 1726773068.44655: sending task start callback, copying the task so we can template it temporarily 8119 1726773068.44658: done copying, going to template now 8119 1726773068.44660: done templating 8119 1726773068.44662: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:08 -0400 (0:00:00.382) 0:01:03.003 **** 8119 1726773068.44686: sending task start callback 8119 1726773068.44689: entering _queue_task() for managed_node2/stat 8119 1726773068.44868: worker is 1 (out of 1 available) 8119 1726773068.44907: exiting _queue_task() for managed_node2/stat 8119 1726773068.44977: done queuing things up, now waiting for results queue to drain 8119 1726773068.44982: waiting for pending results... 
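Note: the "Read tuned main config" invocation above ran the collection's kernel_settings_get_config module with path=/etc/tuned/tuned-main.conf and returned the daemon/dynamic_tuning/sleep_interval key-value data shown in the result. The task behind it is roughly the following; the register name is an assumption for illustration.

```yaml
# Sketch reconstructed from the logged module_args; register name is hypothetical.
- name: Read tuned main config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: /etc/tuned/tuned-main.conf
  register: __kernel_settings_tuned_main   # hypothetical variable name
```

The keys in the returned "data" dict (daemon, dynamic_tuning, sleep_interval, update_interval, and so on) mirror the settings in /etc/tuned/tuned-main.conf on the managed node.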
10615 1726773068.45296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10615 1726773068.45365: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f1 10615 1726773068.47479: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10615 1726773068.47573: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10615 1726773068.47631: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10615 1726773068.47660: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10615 1726773068.47687: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10615 1726773068.47718: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10615 1726773068.47766: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10615 1726773068.47792: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10615 1726773068.47823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10615 1726773068.47909: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10615 1726773068.47930: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10615 1726773068.47944: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10615 1726773068.48332: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10615 1726773068.48337: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10615 1726773068.48339: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10615 1726773068.48342: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10615 1726773068.48345: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10615 1726773068.48349: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.48352: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10615 1726773068.48355: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10615 1726773068.48357: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10615 1726773068.48378: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10615 
1726773068.48385: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10615 1726773068.48389: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.48744: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10615 1726773068.48750: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10615 1726773068.48753: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10615 1726773068.48755: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10615 1726773068.48758: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10615 1726773068.48760: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.48763: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10615 1726773068.48765: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10615 1726773068.48767: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10615 1726773068.48803: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10615 1726773068.48808: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10615 1726773068.48813: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.49039: when evaluation is False, skipping this task 10615 1726773068.49094: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10615 1726773068.49101: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10615 1726773068.49105: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10615 1726773068.49108: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10615 1726773068.49114: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10615 1726773068.49118: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.49121: Loading 
FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10615 1726773068.49123: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10615 1726773068.49126: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10615 1726773068.49155: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10615 1726773068.49160: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10615 1726773068.49163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.49365: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10615 1726773068.49371: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10615 1726773068.49374: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10615 1726773068.49377: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10615 1726773068.49380: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10615 1726773068.49385: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.49389: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10615 1726773068.49392: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10615 1726773068.49394: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10615 1726773068.49426: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10615 1726773068.49431: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10615 1726773068.49434: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.49752: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10615 1726773068.49834: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10615 1726773068.49845: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10615 1726773068.49857: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10615 1726773068.49864: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "item": "", "skip_reason": "Conditional result was False" } 10615 1726773068.49981: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10615 1726773068.49999: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10615 1726773068.50027: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 10615 1726773068.50035: starting attempt loop 10615 1726773068.50037: running the handler 10615 1726773068.50044: _low_level_execute_command(): starting 10615 1726773068.50048: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10615 1726773068.52889: stdout chunk (state=2): >>>/root <<< 10615 1726773068.52933: stderr chunk (state=3): >>><<< 10615 1726773068.52941: stdout chunk (state=3): >>><<< 10615 1726773068.52968: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10615 1726773068.52985: _low_level_execute_command(): starting 10615 1726773068.52992: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550 `" && echo ansible-tmp-1726773068.5297735-10615-50456959890550="` echo /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550 `" ) && sleep 0' 10615 1726773068.55722: stdout chunk (state=2): >>>ansible-tmp-1726773068.5297735-10615-50456959890550=/root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550 <<< 10615 1726773068.56252: stderr chunk (state=3): >>><<< 10615 1726773068.56261: stdout chunk (state=3): >>><<< 10615 1726773068.56289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773068.5297735-10615-50456959890550=/root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550 , stderr= 10615 1726773068.56415: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 10615 1726773068.56491: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550/AnsiballZ_stat.py 10615 1726773068.56922: Sending initial data 10615 1726773068.56936: Sent initial data (151 bytes) 10615 1726773068.59536: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpx6qtrcdd /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550/AnsiballZ_stat.py <<< 10615 1726773068.61020: stderr chunk (state=3): >>><<< 10615 1726773068.61026: stdout chunk (state=3): >>><<< 10615 1726773068.61048: done transferring module to remote 10615 1726773068.61060: _low_level_execute_command(): starting 10615 1726773068.61064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550/ /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550/AnsiballZ_stat.py && sleep 0' 10615 1726773068.63608: stderr chunk (state=2): >>><<< 10615 1726773068.63621: stdout chunk (state=2): >>><<< 10615 1726773068.63640: 
_low_level_execute_command() done: rc=0, stdout=, stderr= 10615 1726773068.63643: _low_level_execute_command(): starting 10615 1726773068.63649: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550/AnsiballZ_stat.py && sleep 0' 10615 1726773068.78079: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10615 1726773068.79008: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10615 1726773068.79064: stderr chunk (state=3): >>><<< 10615 1726773068.79070: stdout chunk (state=3): >>><<< 10615 1726773068.79093: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 10615 1726773068.79121: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10615 1726773068.79132: _low_level_execute_command(): starting 10615 1726773068.79138: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773068.5297735-10615-50456959890550/ > /dev/null 2>&1 && sleep 0' 10615 1726773068.81785: stderr chunk (state=2): >>><<< 10615 1726773068.81797: stdout chunk (state=2): >>><<< 10615 1726773068.81817: _low_level_execute_command() done: rc=0, stdout=, stderr= 10615 1726773068.81824: handler run complete 10615 1726773068.81854: attempt loop complete, returning result 10615 1726773068.82203: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10615 1726773068.82214: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10615 1726773068.82220: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10615 1726773068.82224: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10615 1726773068.82227: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10615 1726773068.82230: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.82233: Loading FilterModule 'network' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10615 1726773068.82236: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10615 1726773068.82239: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10615 1726773068.82274: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10615 1726773068.82278: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10615 1726773068.82281: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10615 1726773068.82554: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10615 1726773068.82561: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10615 1726773068.82565: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10615 1726773068.82643: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10615 1726773068.82656: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10615 1726773068.82661: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10615 1726773068.82667: starting attempt loop 10615 1726773068.82669: running the handler 10615 1726773068.82674: _low_level_execute_command(): starting 10615 1726773068.82677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10615 1726773068.85031: stdout chunk (state=2): >>>/root <<< 10615 1726773068.85148: stderr chunk (state=3): >>><<< 10615 1726773068.85152: stdout chunk (state=3): >>><<< 10615 1726773068.85170: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10615 1726773068.85187: _low_level_execute_command(): starting 10615 1726773068.85195: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158 `" && echo ansible-tmp-1726773068.8517928-10615-106713290813158="` echo /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158 `" ) && sleep 0' 10615 1726773068.87837: stdout chunk (state=2): >>>ansible-tmp-1726773068.8517928-10615-106713290813158=/root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158 <<< 10615 1726773068.87963: stderr chunk (state=3): >>><<< 10615 1726773068.87970: stdout chunk (state=3): >>><<< 10615 1726773068.87989: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773068.8517928-10615-106713290813158=/root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158 , stderr= 10615 1726773068.88070: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 10615 1726773068.88127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158/AnsiballZ_stat.py 10615 1726773068.88412: Sending initial data 10615 1726773068.88426: Sent initial data (152 bytes) 10615 1726773068.90841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp57dyzon7 /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158/AnsiballZ_stat.py <<< 10615 1726773068.91856: stderr chunk (state=3): >>><<< 10615 1726773068.91862: stdout chunk (state=3): >>><<< 10615 1726773068.91886: done transferring module to remote 10615 1726773068.91897: _low_level_execute_command(): starting 10615 1726773068.91901: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158/ /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158/AnsiballZ_stat.py && sleep 0' 10615 1726773068.94448: stderr chunk (state=2): >>><<< 10615 1726773068.94460: stdout chunk (state=2): >>><<< 10615 1726773068.94479: _low_level_execute_command() done: rc=0, stdout=, stderr= 10615 1726773068.94484: _low_level_execute_command(): starting 10615 1726773068.94491: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158/AnsiballZ_stat.py && sleep 0' 10615 1726773069.10644: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10615 1726773069.11715: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 10615 1726773069.11758: stderr chunk (state=3): >>><<< 10615 1726773069.11762: stdout chunk (state=3): >>><<< 10615 1726773069.11780: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 10615 1726773069.11842: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10615 1726773069.11854: _low_level_execute_command(): starting 10615 1726773069.11859: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773068.8517928-10615-106713290813158/ > /dev/null 2>&1 && sleep 0' 10615 1726773069.15104: stderr chunk (state=2): >>><<< 10615 1726773069.15123: stdout chunk (state=2): >>><<< 10615 1726773069.15152: _low_level_execute_command() done: rc=0, stdout=, stderr= 10615 1726773069.15163: handler run complete 10615 1726773069.15233: attempt loop complete, returning result 10615 1726773069.15466: dumping result to json 10615 1726773069.15479: done dumping result, returning 10615 1726773069.15496: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [12a3200b-1e9d-1dbd-cc52-0000000002f1] 10615 1726773069.15508: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f1 10615 1726773069.15515: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f1 10615 1726773069.15518: WORKER PROCESS EXITING ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773035.2883239, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773033.0853279, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 
1726773033.0853279, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8119 1726773069.15856: no more pending results, returning what we have 8119 1726773069.15864: results queue empty 8119 1726773069.15867: checking for any_errors_fatal 8119 1726773069.15873: done checking for any_errors_fatal 8119 1726773069.15875: checking for max_fail_percentage 8119 1726773069.15879: done checking for max_fail_percentage 8119 1726773069.15881: checking to see if all hosts have failed and the running result is not ok 8119 1726773069.15886: done checking to see if all hosts have failed 8119 1726773069.15888: getting the remaining hosts for this loop 8119 1726773069.15892: done getting the remaining hosts for this loop 8119 1726773069.15899: building list of next tasks for hosts 8119 1726773069.15903: getting the next task for host managed_node2 8119 1726773069.15916: done getting next task for host managed_node2 8119 1726773069.15922: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773069.15926: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773069.15929: done building task lists 8119 1726773069.15931: counting tasks in each state of execution 8119 1726773069.15937: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773069.15939: advancing hosts in ITERATING_TASKS 8119 1726773069.15941: starting to advance hosts 8119 1726773069.15944: getting the next task for host managed_node2 8119 1726773069.15948: done getting next task for host managed_node2 8119 1726773069.15952: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773069.15955: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773069.15957: done advancing hosts to next task 8119 1726773069.15974: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773069.15979: getting variables 8119 1726773069.15984: in VariableManager get_vars() 8119 1726773069.16026: Calling all_inventory to load vars for managed_node2 8119 1726773069.16032: Calling groups_inventory to load vars for managed_node2 8119 1726773069.16035: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773069.16062: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.16075: Calling all_plugins_play to load vars for managed_node2 8119 1726773069.16092: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.16105: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773069.16122: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.16131: Calling groups_plugins_play to load vars for managed_node2 8119 1726773069.16144: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.16169: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.16190: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.16526: done with get_vars() 8119 1726773069.16540: done getting variables 8119 1726773069.16547: sending task start callback, copying the task so we can template it temporarily 8119 1726773069.16549: done copying, going to template now 8119 1726773069.16552: done templating 8119 1726773069.16554: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.718) 0:01:03.722 **** 8119 1726773069.16576: sending task start callback 8119 1726773069.16579: entering _queue_task() for managed_node2/set_fact 8119 1726773069.16937: worker is 1 (out of 1 available) 8119 1726773069.16973: exiting _queue_task() for managed_node2/set_fact 8119 1726773069.17050: done queuing things up, now waiting for results queue to drain 8119 1726773069.17055: waiting for pending results... 
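The next worker (PID 10643) handles the "Set tuned profile parent dir" task. set_fact is resolved entirely by its action plugin, so no module is transferred to the remote host here, and the result records __kernel_settings_profile_parent: /etc/tuned. A minimal sketch of such a task is given below, assuming the previous stat loop was registered under the hypothetical name used in the earlier sketch; the selection expression is an illustration, not the actual content of roles/kernel_settings/tasks/main.yml:63.

# Hedged sketch only -- the register name and the selection expression are assumptions
- name: Set tuned profile parent dir
  set_fact:
    __kernel_settings_profile_parent: "{{ __kernel_settings_profile_parent_stat.results
      | selectattr('stat', 'defined')
      | selectattr('stat.exists')
      | map(attribute='item')
      | first }}"

Given the stat results above (item="" skipped, /etc/tuned/profiles absent, /etc/tuned present), an expression like this would resolve to /etc/tuned, matching the fact shown in the log.
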
10643 1726773069.17278: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 10643 1726773069.17348: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f2 10643 1726773069.17405: calling self._execute() 10643 1726773069.19906: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10643 1726773069.20025: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10643 1726773069.20089: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10643 1726773069.20136: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10643 1726773069.20181: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10643 1726773069.20230: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10643 1726773069.20314: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10643 1726773069.20350: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10643 1726773069.20377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10643 1726773069.20516: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10643 1726773069.20542: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10643 1726773069.20565: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10643 1726773069.21131: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10643 1726773069.21177: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10643 1726773069.21191: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10643 1726773069.21205: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10643 1726773069.21213: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10643 1726773069.21330: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10643 1726773069.21350: starting attempt loop 10643 1726773069.21352: running the handler 10643 1726773069.21367: handler run complete 10643 1726773069.21371: attempt loop complete, returning result 10643 1726773069.21374: _execute() done 10643 1726773069.21376: dumping result to json 10643 1726773069.21379: done dumping result, returning 10643 1726773069.21398: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [12a3200b-1e9d-1dbd-cc52-0000000002f2] 10643 1726773069.21408: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f2 10643 1726773069.21435: done sending task result for task 
12a3200b-1e9d-1dbd-cc52-0000000002f2 10643 1726773069.21438: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8119 1726773069.21806: no more pending results, returning what we have 8119 1726773069.21813: results queue empty 8119 1726773069.21816: checking for any_errors_fatal 8119 1726773069.21824: done checking for any_errors_fatal 8119 1726773069.21826: checking for max_fail_percentage 8119 1726773069.21829: done checking for max_fail_percentage 8119 1726773069.21831: checking to see if all hosts have failed and the running result is not ok 8119 1726773069.21833: done checking to see if all hosts have failed 8119 1726773069.21834: getting the remaining hosts for this loop 8119 1726773069.21837: done getting the remaining hosts for this loop 8119 1726773069.21844: building list of next tasks for hosts 8119 1726773069.21847: getting the next task for host managed_node2 8119 1726773069.21854: done getting next task for host managed_node2 8119 1726773069.21857: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773069.21861: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773069.21864: done building task lists 8119 1726773069.21865: counting tasks in each state of execution 8119 1726773069.21869: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773069.21871: advancing hosts in ITERATING_TASKS 8119 1726773069.21873: starting to advance hosts 8119 1726773069.21875: getting the next task for host managed_node2 8119 1726773069.21879: done getting next task for host managed_node2 8119 1726773069.21881: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773069.21886: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773069.21888: done advancing hosts to next task 8119 1726773069.21901: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773069.21906: getting variables 8119 1726773069.21908: in VariableManager get_vars() 8119 1726773069.21948: Calling all_inventory to load vars for managed_node2 8119 1726773069.21955: Calling groups_inventory to load vars for managed_node2 8119 1726773069.21959: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773069.21991: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.22009: Calling all_plugins_play to load vars for managed_node2 8119 1726773069.22031: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.22047: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773069.22065: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.22077: Calling groups_plugins_play to load vars for managed_node2 8119 1726773069.22095: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.22130: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.22157: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.22513: done with get_vars() 8119 1726773069.22528: done getting variables 8119 1726773069.22536: sending task start callback, copying the task so we can template it temporarily 8119 1726773069.22539: done copying, going to template now 8119 1726773069.22542: done templating 8119 1726773069.22544: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.059) 0:01:03.782 **** 8119 1726773069.22568: sending task start callback 8119 1726773069.22571: entering _queue_task() for managed_node2/service 8119 1726773069.22724: worker is 1 (out of 1 available) 8119 1726773069.22762: exiting _queue_task() for managed_node2/service 8119 1726773069.22840: done queuing things up, now waiting for results queue to drain 8119 1726773069.22845: waiting for pending results... 
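The final task in this excerpt, handled by worker PID 10647, ensures the tuned service is enabled and running. The logged module_args further down confirm name=tuned, state=started, enabled=true, and on this host the generic service action dispatches to the cached systemd module (ansiballz_cache/systemd-ZIP_DEFLATED), pushed over SSH with the same temp-dir/sftp/chmod/execute sequence seen for the stat task. A hedged sketch of such a task, corresponding to roles/kernel_settings/tasks/main.yml:67, might look like the following; the loop and list variable name are assumptions, not the role's real source.

# Hedged sketch only -- the loop and the list variable name are assumptions
- name: Ensure required services are enabled and started
  service:
    name: "{{ item }}"
    state: started
    enabled: true
  loop: "{{ __kernel_settings_required_services | default(['tuned']) }}"

Because tuned is already active and enabled on the managed node (ActiveState=active, UnitFileState=enabled in the status dump below), the module reports changed=false.
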
10647 1726773069.23070: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 10647 1726773069.23141: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f3 10647 1726773069.25572: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10647 1726773069.25689: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10647 1726773069.25760: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10647 1726773069.25801: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10647 1726773069.25860: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10647 1726773069.25903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10647 1726773069.25967: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10647 1726773069.26000: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10647 1726773069.26029: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10647 1726773069.26156: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10647 1726773069.26184: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10647 1726773069.26213: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10647 1726773069.26415: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10647 1726773069.26421: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10647 1726773069.26425: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10647 1726773069.26428: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10647 1726773069.26432: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10647 1726773069.26435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10647 1726773069.26438: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10647 1726773069.26441: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10647 1726773069.26444: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10647 1726773069.26468: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 
10647 1726773069.26472: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10647 1726773069.26476: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10647 1726773069.26713: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10647 1726773069.26720: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10647 1726773069.26723: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10647 1726773069.26726: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10647 1726773069.26730: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10647 1726773069.26733: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10647 1726773069.26736: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10647 1726773069.26739: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10647 1726773069.26741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10647 1726773069.26769: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10647 1726773069.26773: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10647 1726773069.26777: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10647 1726773069.26904: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10647 1726773069.26952: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10647 1726773069.26964: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10647 1726773069.26981: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10647 1726773069.26992: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10647 1726773069.27136: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10647 1726773069.27148: starting attempt loop 10647 1726773069.27151: running the handler 10647 1726773069.27352: _low_level_execute_command(): 
starting 10647 1726773069.27360: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10647 1726773069.30186: stdout chunk (state=2): >>>/root <<< 10647 1726773069.30287: stderr chunk (state=3): >>><<< 10647 1726773069.30293: stdout chunk (state=3): >>><<< 10647 1726773069.30314: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10647 1726773069.30330: _low_level_execute_command(): starting 10647 1726773069.30337: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042 `" && echo ansible-tmp-1726773069.3032231-10647-251375812549042="` echo /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042 `" ) && sleep 0' 10647 1726773069.33030: stdout chunk (state=2): >>>ansible-tmp-1726773069.3032231-10647-251375812549042=/root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042 <<< 10647 1726773069.33147: stderr chunk (state=3): >>><<< 10647 1726773069.33154: stdout chunk (state=3): >>><<< 10647 1726773069.33175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773069.3032231-10647-251375812549042=/root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042 , stderr= 10647 1726773069.33289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/systemd-ZIP_DEFLATED 10647 1726773069.33379: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042/AnsiballZ_systemd.py 10647 1726773069.33702: Sending initial data 10647 1726773069.33719: Sent initial data (155 bytes) 10647 1726773069.36321: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmptdy4453t /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042/AnsiballZ_systemd.py <<< 10647 1726773069.38680: stderr chunk (state=3): >>><<< 10647 1726773069.38691: stdout chunk (state=3): >>><<< 10647 1726773069.38725: done transferring module to remote 10647 1726773069.38743: _low_level_execute_command(): starting 10647 1726773069.38749: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042/ /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042/AnsiballZ_systemd.py && sleep 0' 10647 1726773069.41988: stderr chunk (state=2): >>><<< 10647 1726773069.42004: stdout chunk (state=2): >>><<< 10647 1726773069.42031: _low_level_execute_command() done: rc=0, stdout=, stderr= 10647 1726773069.42037: _low_level_execute_command(): starting 10647 1726773069.42045: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042/AnsiballZ_systemd.py && sleep 0' 10647 1726773069.68456: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", 
"Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18415616", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 10647 1726773069.68490: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot 
cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChange<<< 10647 1726773069.68498: stdout chunk (state=3): >>>TimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 10647 
1726773069.69885: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10647 1726773069.69934: stderr chunk (state=3): >>><<< 10647 1726773069.69939: stdout chunk (state=3): >>><<< 10647 1726773069.69959: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18415616", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": 
"no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 10647 1726773069.70067: done with _execute_module (systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10647 1726773069.70085: _low_level_execute_command(): starting 10647 1726773069.70093: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773069.3032231-10647-251375812549042/ > /dev/null 2>&1 && sleep 0' 10647 1726773069.72728: stderr chunk (state=2): >>><<< 10647 1726773069.72739: stdout chunk (state=2): >>><<< 10647 1726773069.72759: _low_level_execute_command() done: rc=0, stdout=, stderr= 10647 1726773069.72769: handler run complete 10647 1726773069.72777: attempt loop complete, returning result 10647 1726773069.72847: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10647 1726773069.72853: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10647 1726773069.72855: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10647 1726773069.72858: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10647 1726773069.72860: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10647 1726773069.72862: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10647 1726773069.72865: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10647 1726773069.72867: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10647 1726773069.72869: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10647 1726773069.72907: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10647 1726773069.72912: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10647 1726773069.72914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10647 1726773069.73063: dumping result to json 10647 1726773069.73179: done dumping result, returning 10647 1726773069.73201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [12a3200b-1e9d-1dbd-cc52-0000000002f3] 10647 1726773069.73212: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f3 10647 1726773069.73217: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f3 10647 1726773069.73220: WORKER PROCESS EXITING ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", 
"Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "658", "MemoryAccounting": "yes", "MemoryCurrent": "18415616", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "WatchdogUSec": "0" } } 8119 1726773069.73652: no more pending results, returning what we have 8119 1726773069.73658: results queue empty 8119 1726773069.73660: checking for any_errors_fatal 8119 1726773069.73663: done checking for any_errors_fatal 8119 1726773069.73664: checking for max_fail_percentage 8119 1726773069.73667: done checking for max_fail_percentage 8119 1726773069.73668: checking to see if all hosts have failed and the running result is not ok 8119 1726773069.73669: done checking to see if all hosts have failed 8119 1726773069.73671: getting the remaining hosts for this loop 8119 1726773069.73672: done getting the remaining hosts for this loop 8119 1726773069.73677: building list of next tasks for hosts 8119 1726773069.73679: getting the next task for host managed_node2 8119 1726773069.73686: done getting next task for host managed_node2 8119 1726773069.73691: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773069.73695: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773069.73697: done building task lists 8119 1726773069.73698: counting tasks in each state of execution 8119 1726773069.73701: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773069.73703: advancing hosts in ITERATING_TASKS 8119 1726773069.73704: starting to advance hosts 8119 1726773069.73705: getting the next task for host managed_node2 8119 1726773069.73708: done getting next task for host managed_node2 8119 1726773069.73710: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773069.73713: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773069.73714: done advancing hosts to next task 8119 1726773069.73725: getting variables 8119 1726773069.73728: in VariableManager get_vars() 8119 1726773069.73753: Calling all_inventory to load vars for managed_node2 8119 1726773069.73756: Calling groups_inventory to load vars for managed_node2 8119 1726773069.73759: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773069.73779: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.73792: Calling all_plugins_play to load vars for managed_node2 8119 1726773069.73807: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.73820: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773069.73832: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.73839: Calling groups_plugins_play to load vars for managed_node2 8119 1726773069.73848: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.73866: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.73879: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773069.74086: done with get_vars() 8119 1726773069.74096: done getting variables 8119 1726773069.74101: sending task start callback, copying the task so we can template it temporarily 8119 1726773069.74102: done copying, going to template now 8119 1726773069.74104: done templating 8119 1726773069.74106: here goes the callback... 
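For readers following the trace: the systemd module invocation above (worker PID 10647) corresponds to the role task "Ensure required services are enabled and started". A minimal sketch of that task, reconstructed only from the module arguments and loop item visible in the log (name=tuned, state=started, enabled=true, ansible_loop_var=item), would look roughly like the snippet below; the role's actual tasks/main.yml very likely loops over a role-defined service list rather than a literal value.

    - name: Ensure required services are enabled and started
      systemd:
        name: "{{ item }}"      # "tuned" in this run
        state: started
        enabled: true
      loop:
        - tuned                 # literal value here only for illustration

Because tuned was already enabled and running on the managed node, the module reports changed: false and simply returns the unit's current properties, which is the large status dictionary dumped above.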
TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.515) 0:01:04.297 **** 8119 1726773069.74122: sending task start callback 8119 1726773069.74123: entering _queue_task() for managed_node2/file 8119 1726773069.74244: worker is 1 (out of 1 available) 8119 1726773069.74285: exiting _queue_task() for managed_node2/file 8119 1726773069.74356: done queuing things up, now waiting for results queue to drain 8119 1726773069.74361: waiting for pending results... 10667 1726773069.74430: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 10667 1726773069.74479: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f4 10667 1726773069.74528: calling self._execute() 10667 1726773069.76301: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10667 1726773069.76382: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10667 1726773069.76441: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10667 1726773069.76477: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10667 1726773069.76509: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10667 1726773069.76545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10667 1726773069.76599: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10667 1726773069.76625: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10667 1726773069.76642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10667 1726773069.76734: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10667 1726773069.76753: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10667 1726773069.76767: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10667 1726773069.76993: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10667 1726773069.77032: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10667 1726773069.77044: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10667 1726773069.77054: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10667 1726773069.77060: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10667 1726773069.77160: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10667 1726773069.77179: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10667 1726773069.77213: Loading 
ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 10667 1726773069.77232: starting attempt loop 10667 1726773069.77235: running the handler 10667 1726773069.77245: _low_level_execute_command(): starting 10667 1726773069.77251: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10667 1726773069.79997: stdout chunk (state=2): >>>/root <<< 10667 1726773069.80099: stderr chunk (state=3): >>><<< 10667 1726773069.80104: stdout chunk (state=3): >>><<< 10667 1726773069.80126: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10667 1726773069.80144: _low_level_execute_command(): starting 10667 1726773069.80154: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569 `" && echo ansible-tmp-1726773069.801347-10667-55482644914569="` echo /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569 `" ) && sleep 0' 10667 1726773069.83656: stdout chunk (state=2): >>>ansible-tmp-1726773069.801347-10667-55482644914569=/root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569 <<< 10667 1726773069.83673: stderr chunk (state=2): >>><<< 10667 1726773069.83691: stdout chunk (state=3): >>><<< 10667 1726773069.83712: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773069.801347-10667-55482644914569=/root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569 , stderr= 10667 1726773069.83827: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 10667 1726773069.83903: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569/AnsiballZ_file.py 10667 1726773069.85126: Sending initial data 10667 1726773069.85140: Sent initial data (150 bytes) 10667 1726773069.88794: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmph40gjty8 /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569/AnsiballZ_file.py <<< 10667 1726773069.89197: stderr chunk (state=3): >>><<< 10667 1726773069.89205: stdout chunk (state=3): >>><<< 10667 1726773069.89234: done transferring module to remote 10667 1726773069.89252: _low_level_execute_command(): starting 10667 1726773069.89257: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569/ /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569/AnsiballZ_file.py && sleep 0' 10667 1726773069.92395: stderr chunk (state=2): >>><<< 10667 1726773069.92415: stdout chunk (state=2): >>><<< 10667 1726773069.92441: _low_level_execute_command() done: rc=0, stdout=, stderr= 10667 1726773069.92447: _low_level_execute_command(): starting 10667 1726773069.92456: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569/AnsiballZ_file.py && sleep 0' 10667 1726773070.08996: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": 
"unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 10667 1726773070.10295: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10667 1726773070.10345: stderr chunk (state=3): >>><<< 10667 1726773070.10352: stdout chunk (state=3): >>><<< 10667 1726773070.10376: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
10667 1726773070.10423: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10667 1726773070.10439: _low_level_execute_command(): starting 10667 1726773070.10445: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773069.801347-10667-55482644914569/ > /dev/null 2>&1 && sleep 0' 10667 1726773070.13480: stderr chunk (state=2): >>><<< 10667 1726773070.13500: stdout chunk (state=2): >>><<< 10667 1726773070.13530: _low_level_execute_command() done: rc=0, stdout=, stderr= 10667 1726773070.13540: handler run complete 10667 1726773070.13547: attempt loop complete, returning result 10667 1726773070.13564: _execute() done 10667 1726773070.13568: dumping result to json 10667 1726773070.13575: done dumping result, returning 10667 1726773070.13595: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [12a3200b-1e9d-1dbd-cc52-0000000002f4] 10667 1726773070.13617: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f4 10667 1726773070.13680: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f4 10667 1726773070.13687: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8119 1726773070.14363: no more pending results, returning what we have 8119 1726773070.14369: results queue empty 8119 1726773070.14372: checking for any_errors_fatal 8119 1726773070.14382: done checking for any_errors_fatal 8119 1726773070.14387: checking for max_fail_percentage 8119 1726773070.14391: done checking for max_fail_percentage 8119 1726773070.14393: checking to see if all hosts have failed and the running result is not ok 8119 1726773070.14395: done checking to see if all hosts have failed 8119 1726773070.14398: getting the remaining hosts for this loop 8119 1726773070.14401: done getting the remaining hosts for this loop 8119 1726773070.14409: building list of next tasks for hosts 8119 1726773070.14416: getting the next task for host managed_node2 8119 1726773070.14424: done getting next task for host managed_node2 8119 1726773070.14429: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773070.14433: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8119 1726773070.14436: done building task lists 8119 1726773070.14438: counting tasks in each state of execution 8119 1726773070.14443: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773070.14445: advancing hosts in ITERATING_TASKS 8119 1726773070.14448: starting to advance hosts 8119 1726773070.14450: getting the next task for host managed_node2 8119 1726773070.14455: done getting next task for host managed_node2 8119 1726773070.14458: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773070.14461: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773070.14464: done advancing hosts to next task 8119 1726773070.14479: getting variables 8119 1726773070.14485: in VariableManager get_vars() 8119 1726773070.14528: Calling all_inventory to load vars for managed_node2 8119 1726773070.14534: Calling groups_inventory to load vars for managed_node2 8119 1726773070.14539: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773070.14569: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.14589: Calling all_plugins_play to load vars for managed_node2 8119 1726773070.14609: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.14628: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773070.14646: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.14656: Calling groups_plugins_play to load vars for managed_node2 8119 1726773070.14674: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.14708: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.14737: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.15080: done with get_vars() 8119 1726773070.15096: done getting variables 8119 1726773070.15103: sending task start callback, copying the task so we can template it temporarily 8119 1726773070.15105: done copying, going to template now 8119 1726773070.15107: done templating 8119 1726773070.15109: here goes the callback... 
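The task that just reported ok (worker PID 10667) creates the tuned profile directory with the file module. Based on the module arguments shown in the log (path=/etc/tuned/kernel_settings, state=directory, mode=0755), an approximate sketch of the task is:

    - name: Ensure kernel settings profile directory exists
      file:
        path: /etc/tuned/kernel_settings
        state: directory
        mode: "0755"

The directory already existed with the requested mode and SELinux context (tuned_etc_t), so the result is again changed: false.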
TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:10 -0400 (0:00:00.410) 0:01:04.707 **** 8119 1726773070.15133: sending task start callback 8119 1726773070.15135: entering _queue_task() for managed_node2/slurp 10698 1726773070.15576: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 10698 1726773070.15651: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f5 10698 1726773070.15730: calling self._execute() 8119 1726773070.16390: worker is 1 (out of 1 available) 8119 1726773070.16409: exiting _queue_task() for managed_node2/slurp 8119 1726773070.16447: done queuing things up, now waiting for results queue to drain 8119 1726773070.16450: waiting for pending results... 10698 1726773070.18065: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10698 1726773070.18191: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10698 1726773070.18251: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10698 1726773070.18286: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10698 1726773070.18323: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10698 1726773070.18358: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10698 1726773070.18413: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10698 1726773070.18441: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10698 1726773070.18461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10698 1726773070.18566: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10698 1726773070.18593: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10698 1726773070.18615: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10698 1726773070.18916: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10698 1726773070.18967: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10698 1726773070.19017: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10698 1726773070.19037: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10698 1726773070.19047: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10698 1726773070.19180: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10698 1726773070.19205: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10698 1726773070.19240: Loading ActionModule 'normal' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 10698 1726773070.19251: starting attempt loop 10698 1726773070.19254: running the handler 10698 1726773070.19264: _low_level_execute_command(): starting 10698 1726773070.19270: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10698 1726773070.22110: stdout chunk (state=2): >>>/root <<< 10698 1726773070.22376: stderr chunk (state=3): >>><<< 10698 1726773070.22386: stdout chunk (state=3): >>><<< 10698 1726773070.22415: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10698 1726773070.22435: _low_level_execute_command(): starting 10698 1726773070.22443: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709 `" && echo ansible-tmp-1726773070.2242782-10698-34939779923709="` echo /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709 `" ) && sleep 0' 10698 1726773070.26103: stdout chunk (state=2): >>>ansible-tmp-1726773070.2242782-10698-34939779923709=/root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709 <<< 10698 1726773070.27264: stderr chunk (state=3): >>><<< 10698 1726773070.27274: stdout chunk (state=3): >>><<< 10698 1726773070.27302: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773070.2242782-10698-34939779923709=/root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709 , stderr= 10698 1726773070.27410: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/slurp-ZIP_DEFLATED 10698 1726773070.27486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709/AnsiballZ_slurp.py 10698 1726773070.28242: Sending initial data 10698 1726773070.28255: Sent initial data (152 bytes) 10698 1726773070.30905: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpoc78cm8e /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709/AnsiballZ_slurp.py <<< 10698 1726773070.32305: stderr chunk (state=3): >>><<< 10698 1726773070.32313: stdout chunk (state=3): >>><<< 10698 1726773070.32343: done transferring module to remote 10698 1726773070.32362: _low_level_execute_command(): starting 10698 1726773070.32368: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709/ /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709/AnsiballZ_slurp.py && sleep 0' 10698 1726773070.35747: stderr chunk (state=2): >>><<< 10698 1726773070.35758: stdout chunk (state=2): >>><<< 10698 1726773070.35777: _low_level_execute_command() done: rc=0, stdout=, stderr= 10698 1726773070.35780: _low_level_execute_command(): starting 10698 1726773070.35789: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709/AnsiballZ_slurp.py && sleep 0' 10698 1726773070.50320: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 10698 1726773070.51259: stderr chunk (state=3): >>>Shared 
connection to 10.31.8.150 closed. <<< 10698 1726773070.51303: stderr chunk (state=3): >>><<< 10698 1726773070.51309: stdout chunk (state=3): >>><<< 10698 1726773070.51330: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.150 closed. 10698 1726773070.51354: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10698 1726773070.51365: _low_level_execute_command(): starting 10698 1726773070.51370: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773070.2242782-10698-34939779923709/ > /dev/null 2>&1 && sleep 0' 10698 1726773070.53992: stderr chunk (state=2): >>><<< 10698 1726773070.54002: stdout chunk (state=2): >>><<< 10698 1726773070.54020: _low_level_execute_command() done: rc=0, stdout=, stderr= 10698 1726773070.54026: handler run complete 10698 1726773070.54051: attempt loop complete, returning result 10698 1726773070.54066: _execute() done 10698 1726773070.54071: dumping result to json 10698 1726773070.54075: done dumping result, returning 10698 1726773070.54090: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [12a3200b-1e9d-1dbd-cc52-0000000002f5] 10698 1726773070.54104: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f5 10698 1726773070.54140: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f5 10698 1726773070.54144: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8119 1726773070.54379: no more pending results, returning what we have 8119 1726773070.54385: results queue empty 8119 1726773070.54388: checking for any_errors_fatal 8119 1726773070.54394: done checking for any_errors_fatal 8119 1726773070.54396: checking for max_fail_percentage 8119 1726773070.54398: done checking for max_fail_percentage 8119 1726773070.54399: checking to see if all hosts have failed and the running result is not ok 8119 1726773070.54400: done checking to see if all hosts have failed 8119 1726773070.54402: getting the remaining hosts for this loop 8119 1726773070.54404: done getting the remaining hosts for this loop 8119 1726773070.54409: building list of next tasks for hosts 8119 1726773070.54411: getting the next task for host managed_node2 8119 1726773070.54417: done getting next task for host managed_node2 8119 1726773070.54420: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773070.54423: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773070.54425: done building task lists 8119 1726773070.54426: counting tasks in each state of execution 8119 1726773070.54429: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773070.54430: advancing hosts in ITERATING_TASKS 8119 1726773070.54432: starting to advance hosts 8119 1726773070.54433: getting the next task for host managed_node2 8119 1726773070.54436: done getting next task for host managed_node2 8119 1726773070.54438: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773070.54440: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773070.54441: done advancing hosts to next task 8119 1726773070.54453: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773070.54455: getting variables 8119 1726773070.54457: in VariableManager get_vars() 8119 1726773070.54484: Calling all_inventory to load vars for managed_node2 8119 1726773070.54489: Calling groups_inventory to load vars for managed_node2 8119 1726773070.54493: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773070.54518: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.54530: Calling all_plugins_play to load vars for managed_node2 8119 1726773070.54540: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.54549: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773070.54559: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.54565: Calling groups_plugins_play to load vars for managed_node2 8119 1726773070.54574: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.54596: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.54612: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 
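The Get active_profile step (worker PID 10698) reads /etc/tuned/active_profile with the slurp module, which always returns file content base64-encoded; the payload dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK decodes to the single line "virtual-guest kernel_settings". The Set active_profile task that follows stores that value as the fact __kernel_settings_active_profile. One way to express this read-and-decode pattern is sketched below; the register name is illustrative, and the role's real expression may combine the decoded value with other variables rather than use it verbatim.

    - name: Get active_profile
      slurp:
        path: /etc/tuned/active_profile
      register: __profile_raw            # illustrative name, not taken from the role

    - name: Set active_profile
      set_fact:
        __kernel_settings_active_profile: "{{ __profile_raw.content | b64decode | trim }}"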
8119 1726773070.54835: done with get_vars() 8119 1726773070.54848: done getting variables 8119 1726773070.54854: sending task start callback, copying the task so we can template it temporarily 8119 1726773070.54855: done copying, going to template now 8119 1726773070.54857: done templating 8119 1726773070.54858: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:10 -0400 (0:00:00.397) 0:01:05.105 **** 8119 1726773070.54874: sending task start callback 8119 1726773070.54876: entering _queue_task() for managed_node2/set_fact 8119 1726773070.54986: worker is 1 (out of 1 available) 8119 1726773070.55025: exiting _queue_task() for managed_node2/set_fact 8119 1726773070.55098: done queuing things up, now waiting for results queue to drain 8119 1726773070.55103: waiting for pending results... 10721 1726773070.55165: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 10721 1726773070.55218: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f6 10721 1726773070.55265: calling self._execute() 10721 1726773070.56993: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10721 1726773070.57094: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10721 1726773070.57147: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10721 1726773070.57175: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10721 1726773070.57207: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10721 1726773070.57235: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10721 1726773070.57278: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10721 1726773070.57306: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10721 1726773070.57324: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10721 1726773070.57405: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10721 1726773070.57425: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10721 1726773070.57439: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10721 1726773070.57765: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10721 1726773070.57800: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10721 1726773070.57813: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10721 1726773070.57823: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10721 1726773070.57830: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10721 1726773070.57936: 
Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10721 1726773070.57954: starting attempt loop 10721 1726773070.57958: running the handler 10721 1726773070.57970: handler run complete 10721 1726773070.57973: attempt loop complete, returning result 10721 1726773070.57975: _execute() done 10721 1726773070.57976: dumping result to json 10721 1726773070.57978: done dumping result, returning 10721 1726773070.57984: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [12a3200b-1e9d-1dbd-cc52-0000000002f6] 10721 1726773070.57991: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f6 10721 1726773070.58022: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f6 10721 1726773070.58026: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8119 1726773070.58213: no more pending results, returning what we have 8119 1726773070.58219: results queue empty 8119 1726773070.58221: checking for any_errors_fatal 8119 1726773070.58227: done checking for any_errors_fatal 8119 1726773070.58229: checking for max_fail_percentage 8119 1726773070.58232: done checking for max_fail_percentage 8119 1726773070.58234: checking to see if all hosts have failed and the running result is not ok 8119 1726773070.58236: done checking to see if all hosts have failed 8119 1726773070.58237: getting the remaining hosts for this loop 8119 1726773070.58240: done getting the remaining hosts for this loop 8119 1726773070.58247: building list of next tasks for hosts 8119 1726773070.58250: getting the next task for host managed_node2 8119 1726773070.58257: done getting next task for host managed_node2 8119 1726773070.58260: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773070.58263: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773070.58265: done building task lists 8119 1726773070.58266: counting tasks in each state of execution 8119 1726773070.58268: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773070.58270: advancing hosts in ITERATING_TASKS 8119 1726773070.58271: starting to advance hosts 8119 1726773070.58273: getting the next task for host managed_node2 8119 1726773070.58275: done getting next task for host managed_node2 8119 1726773070.58277: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773070.58279: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773070.58280: done advancing hosts to next task 8119 1726773070.58294: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773070.58299: getting variables 8119 1726773070.58301: in VariableManager get_vars() 8119 1726773070.58331: Calling all_inventory to load vars for managed_node2 8119 1726773070.58334: Calling groups_inventory to load vars for managed_node2 8119 1726773070.58337: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773070.58356: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.58366: Calling all_plugins_play to load vars for managed_node2 8119 1726773070.58376: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.58386: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773070.58398: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.58407: Calling groups_plugins_play to load vars for managed_node2 8119 1726773070.58421: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.58440: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.58454: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773070.58661: done with get_vars() 8119 1726773070.58671: done getting variables 8119 1726773070.58675: sending task start callback, copying the task so we can template it temporarily 8119 1726773070.58677: done copying, going to template now 8119 1726773070.58679: done templating 8119 1726773070.58680: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:10 -0400 (0:00:00.038) 0:01:05.143 **** 8119 1726773070.58697: sending task start callback 8119 1726773070.58699: entering _queue_task() for managed_node2/copy 8119 1726773070.58816: worker is 1 (out of 1 available) 8119 1726773070.58852: exiting _queue_task() for managed_node2/copy 8119 1726773070.58924: done queuing things up, now waiting for results queue to drain 8119 1726773070.58929: waiting for pending results... 
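The "Set active_profile" result just above sets __kernel_settings_active_profile to "virtual-guest kernel_settings", i.e. the decoded file content with the trailing newline stripped. A hypothetical reconstruction of that logic, assuming the role appends the kernel_settings profile name only when it is not already present (the actual Jinja2 expression in the role may differ):

import base64

def desired_active_profile(slurped_b64: str) -> str:
    # Decode the slurped file, normalize whitespace, and make sure the
    # "kernel_settings" profile is part of the active profile list.
    # Hypothetical reconstruction consistent with the set_fact result above.
    profiles = base64.b64decode(slurped_b64).decode("utf-8").split()
    if "kernel_settings" not in profiles:
        profiles.append("kernel_settings")
    return " ".join(profiles)

# With the content slurped earlier, nothing needs to be appended:
print(desired_active_profile("dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK"))
# -> virtual-guest kernel_settings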
10723 1726773070.58984: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 10723 1726773070.59033: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f7 10723 1726773070.59078: calling self._execute() 10723 1726773070.60808: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10723 1726773070.60889: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10723 1726773070.60943: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10723 1726773070.60969: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10723 1726773070.60998: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10723 1726773070.61029: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10723 1726773070.61071: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10723 1726773070.61096: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10723 1726773070.61115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10723 1726773070.61273: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10723 1726773070.61295: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10723 1726773070.61310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10723 1726773070.61574: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10723 1726773070.61609: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10723 1726773070.61622: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10723 1726773070.61634: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10723 1726773070.61639: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10723 1726773070.61731: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10723 1726773070.61741: starting attempt loop 10723 1726773070.61743: running the handler 10723 1726773070.61751: _low_level_execute_command(): starting 10723 1726773070.61754: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10723 1726773070.64246: stdout chunk (state=2): >>>/root <<< 10723 1726773070.64367: stderr chunk (state=3): >>><<< 10723 1726773070.64375: stdout chunk (state=3): >>><<< 10723 1726773070.64398: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10723 1726773070.64413: _low_level_execute_command(): starting 10723 1726773070.64418: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557 `" && echo ansible-tmp-1726773070.64407-10723-133122121307557="` echo /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557 `" ) && sleep 0' 10723 1726773070.67204: stdout chunk (state=2): >>>ansible-tmp-1726773070.64407-10723-133122121307557=/root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557 <<< 10723 1726773070.67334: stderr chunk (state=3): >>><<< 10723 1726773070.67339: stdout chunk (state=3): >>><<< 10723 1726773070.67357: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773070.64407-10723-133122121307557=/root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557 , stderr= 10723 1726773070.67497: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 10723 1726773070.67553: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_stat.py 10723 1726773070.67837: Sending initial data 10723 1726773070.67853: Sent initial data (150 bytes) 10723 1726773070.70750: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpswq_ykv5 /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_stat.py <<< 10723 1726773070.72654: stderr chunk (state=3): >>><<< 10723 1726773070.72662: stdout chunk (state=3): >>><<< 10723 1726773070.72695: done transferring module to remote 10723 1726773070.72715: _low_level_execute_command(): starting 10723 1726773070.72723: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/ /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_stat.py && sleep 0' 10723 1726773070.75709: stderr chunk (state=2): >>><<< 10723 1726773070.75724: stdout chunk (state=2): >>><<< 10723 1726773070.75748: _low_level_execute_command() done: rc=0, stdout=, stderr= 10723 1726773070.75752: _low_level_execute_command(): starting 10723 1726773070.75760: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_stat.py && sleep 0' 10723 1726773070.91859: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773070.1956651, "mtime": 1726773056.9126449, "ctime": 1726773056.9126449, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 10723 1726773070.92938: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 10723 1726773070.92989: stderr chunk (state=3): >>><<< 10723 1726773070.93000: stdout chunk (state=3): >>><<< 10723 1726773070.93018: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773070.1956651, "mtime": 1726773056.9126449, "ctime": 1726773056.9126449, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 10723 1726773070.93078: done with _execute_module (stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10723 1726773070.93172: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 10723 1726773070.93226: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_file.py 10723 1726773070.93544: Sending initial data 10723 1726773070.93560: Sent initial data (150 bytes) 10723 1726773070.96069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpusjl_5g0 /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_file.py <<< 10723 1726773070.97092: stderr chunk (state=3): >>><<< 10723 1726773070.97097: stdout chunk (state=3): >>><<< 10723 1726773070.97119: done transferring module to remote 10723 1726773070.97131: _low_level_execute_command(): starting 10723 1726773070.97135: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/ /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_file.py && sleep 0' 10723 1726773070.99675: stderr chunk (state=2): >>><<< 10723 1726773070.99690: stdout chunk (state=2): >>><<< 10723 1726773070.99708: _low_level_execute_command() done: rc=0, stdout=, stderr= 10723 1726773070.99714: _low_level_execute_command(): starting 10723 1726773070.99720: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/AnsiballZ_file.py && sleep 0' 10723 1726773071.15145: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp8t9xfp_2", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 10723 1726773071.16266: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10723 1726773071.16275: stdout chunk (state=3): >>><<< 10723 1726773071.16287: stderr chunk (state=3): >>><<< 10723 1726773071.16305: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp8t9xfp_2", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
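In broad terms, the "Ensure kernel_settings is in active_profile" task uses the copy action: it first stats the destination with get_checksum, and because the reported SHA-1 matches the content it would write, no file transfer happens; the action then only runs the file module (state: file, force: false) to enforce mode and ownership, which is why both the file-module output above and the task result that follows report "changed": false. A simplified sketch of that decision, with the checksum value taken from the stat output above (the real action module handles many more cases):

import hashlib

def needs_copy(desired_text: str, remote_sha1: str) -> bool:
    # If the SHA-1 of the content the task would write matches the checksum
    # that the stat module reported for the destination, the transfer is
    # skipped and only attributes are enforced. Simplified sketch.
    return hashlib.sha1(desired_text.encode("utf-8")).hexdigest() != remote_sha1

# Checksum reported by stat for /etc/tuned/active_profile above.
remote = "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd"
# Expected to print False here, given that the task reports no change.
print(needs_copy("virtual-guest kernel_settings\n", remote))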
10723 1726773071.16349: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmp8t9xfp_2', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10723 1726773071.16372: _low_level_execute_command(): starting 10723 1726773071.16381: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773070.64407-10723-133122121307557/ > /dev/null 2>&1 && sleep 0' 10723 1726773071.19198: stderr chunk (state=2): >>><<< 10723 1726773071.19210: stdout chunk (state=2): >>><<< 10723 1726773071.19236: _low_level_execute_command() done: rc=0, stdout=, stderr= 10723 1726773071.19249: handler run complete 10723 1726773071.19282: attempt loop complete, returning result 10723 1726773071.19298: _execute() done 10723 1726773071.19300: dumping result to json 10723 1726773071.19303: done dumping result, returning 10723 1726773071.19317: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [12a3200b-1e9d-1dbd-cc52-0000000002f7] 10723 1726773071.19330: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f7 10723 1726773071.19371: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f7 10723 1726773071.19415: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8119 1726773071.19580: no more pending results, returning what we have 8119 1726773071.19588: results queue empty 8119 1726773071.19590: checking for any_errors_fatal 8119 1726773071.19595: done checking for any_errors_fatal 8119 1726773071.19597: checking for max_fail_percentage 8119 1726773071.19600: done checking for max_fail_percentage 8119 1726773071.19602: checking to see if all hosts have failed and the running result is not ok 8119 1726773071.19604: done checking to see if all hosts have failed 8119 1726773071.19605: getting the remaining hosts for this loop 8119 1726773071.19608: done getting the remaining hosts for this loop 8119 1726773071.19618: building list of next tasks for hosts 8119 1726773071.19621: getting the next task for host managed_node2 8119 1726773071.19628: done getting next task for host managed_node2 8119 1726773071.19632: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773071.19636: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773071.19638: done building task lists 8119 1726773071.19640: counting tasks in each state of execution 8119 1726773071.19644: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773071.19646: advancing hosts in ITERATING_TASKS 8119 1726773071.19648: starting to advance hosts 8119 1726773071.19650: getting the next task for host managed_node2 8119 1726773071.19654: done getting next task for host managed_node2 8119 1726773071.19657: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773071.19659: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773071.19661: done advancing hosts to next task 8119 1726773071.19676: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773071.19680: getting variables 8119 1726773071.19682: in VariableManager get_vars() 8119 1726773071.19713: Calling all_inventory to load vars for managed_node2 8119 1726773071.19717: Calling groups_inventory to load vars for managed_node2 8119 1726773071.19719: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773071.19742: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.19752: Calling all_plugins_play to load vars for managed_node2 8119 1726773071.19762: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.19770: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773071.19780: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.19791: Calling groups_plugins_play to load vars for managed_node2 8119 1726773071.19802: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.19822: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.19836: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.20053: done with get_vars() 8119 1726773071.20063: done getting variables 8119 1726773071.20068: sending task start callback, copying the task so we can template it temporarily 8119 1726773071.20070: 
done copying, going to template now 8119 1726773071.20072: done templating 8119 1726773071.20073: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.613) 0:01:05.757 **** 8119 1726773071.20091: sending task start callback 8119 1726773071.20093: entering _queue_task() for managed_node2/copy 8119 1726773071.20215: worker is 1 (out of 1 available) 8119 1726773071.20253: exiting _queue_task() for managed_node2/copy 8119 1726773071.20329: done queuing things up, now waiting for results queue to drain 8119 1726773071.20335: waiting for pending results... 10752 1726773071.20392: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10752 1726773071.20441: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f8 10752 1726773071.20486: calling self._execute() 10752 1726773071.22212: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10752 1726773071.22315: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10752 1726773071.22368: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10752 1726773071.22397: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10752 1726773071.22428: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10752 1726773071.22457: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10752 1726773071.22502: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10752 1726773071.22526: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10752 1726773071.22547: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10752 1726773071.22626: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10752 1726773071.22644: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10752 1726773071.22661: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10752 1726773071.22891: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10752 1726773071.22928: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10752 1726773071.22940: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10752 1726773071.22950: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10752 1726773071.22955: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10752 1726773071.23056: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10752 1726773071.23073: starting attempt loop 10752 1726773071.23075: running the handler 10752 1726773071.23085: _low_level_execute_command(): starting 10752 1726773071.23090: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10752 1726773071.25532: stdout chunk (state=2): >>>/root <<< 10752 1726773071.25649: stderr chunk (state=3): >>><<< 10752 1726773071.25654: stdout chunk (state=3): >>><<< 10752 1726773071.25673: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10752 1726773071.25688: _low_level_execute_command(): starting 10752 1726773071.25694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666 `" && echo ansible-tmp-1726773071.256807-10752-244284090468666="` echo /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666 `" ) && sleep 0' 10752 1726773071.28531: stdout chunk (state=2): >>>ansible-tmp-1726773071.256807-10752-244284090468666=/root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666 <<< 10752 1726773071.28659: stderr chunk (state=3): >>><<< 10752 1726773071.28665: stdout chunk (state=3): >>><<< 10752 1726773071.28686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773071.256807-10752-244284090468666=/root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666 , stderr= 10752 1726773071.28834: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 10752 1726773071.28893: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_stat.py 10752 1726773071.29184: Sending initial data 10752 1726773071.29199: Sent initial data (151 bytes) 10752 1726773071.31668: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp9fujy29v /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_stat.py <<< 10752 1726773071.32680: stderr chunk (state=3): >>><<< 10752 1726773071.32689: stdout chunk (state=3): >>><<< 10752 1726773071.32720: done transferring module to remote 10752 1726773071.32739: _low_level_execute_command(): starting 10752 1726773071.32746: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/ /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_stat.py && sleep 0' 10752 1726773071.35543: stderr chunk (state=2): >>><<< 10752 1726773071.35558: stdout chunk (state=2): >>><<< 10752 1726773071.35580: _low_level_execute_command() done: rc=0, stdout=, stderr= 10752 1726773071.35591: _low_level_execute_command(): starting 10752 1726773071.35603: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_stat.py && sleep 0' 10752 1726773071.51396: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773056.7866454, "mtime": 
1726773056.9146447, "ctime": 1726773056.9146447, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 10752 1726773071.52590: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10752 1726773071.52605: stdout chunk (state=3): >>><<< 10752 1726773071.52619: stderr chunk (state=3): >>><<< 10752 1726773071.52638: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773056.7866454, "mtime": 1726773056.9146447, "ctime": 1726773056.9146447, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 
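The same pattern repeats for /etc/tuned/profile_mode: the stat output above shows a 7-byte file owned by root with mode 0600 (7 bytes is consistent with the content being "manual\n" given the task name, though the log never shows the content itself). Since nothing has to be rewritten, the follow-up file module call only verifies attributes. A rough illustration of that attribute check, assuming the parameters shown in the module args (sketch only; the real file module also handles SELinux context, timestamps and symlinks):

import grp
import os
import pwd
import stat

def attrs_in_sync(path: str, mode: str = "0600",
                  owner: str = "root", group: str = "root") -> bool:
    # Compare the file's current permission bits and ownership with the
    # requested values; when everything already matches, the task reports
    # "changed": false, as seen in the result below.
    st = os.stat(path)
    return (stat.S_IMODE(st.st_mode) == int(mode, 8)
            and pwd.getpwuid(st.st_uid).pw_name == owner
            and grp.getgrgid(st.st_gid).gr_name == group)

# On managed_node2 this would return True for the file stat'ed above:
# attrs_in_sync("/etc/tuned/profile_mode")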
10752 1726773071.52730: done with _execute_module (stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10752 1726773071.52849: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 10752 1726773071.52909: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_file.py 10752 1726773071.53651: Sending initial data 10752 1726773071.53666: Sent initial data (151 bytes) 10752 1726773071.57149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpc4y7py_7 /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_file.py <<< 10752 1726773071.58389: stderr chunk (state=3): >>><<< 10752 1726773071.58397: stdout chunk (state=3): >>><<< 10752 1726773071.58423: done transferring module to remote 10752 1726773071.58436: _low_level_execute_command(): starting 10752 1726773071.58440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/ /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_file.py && sleep 0' 10752 1726773071.61038: stderr chunk (state=2): >>><<< 10752 1726773071.61048: stdout chunk (state=2): >>><<< 10752 1726773071.61065: _low_level_execute_command() done: rc=0, stdout=, stderr= 10752 1726773071.61068: _low_level_execute_command(): starting 10752 1726773071.61074: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/AnsiballZ_file.py && sleep 0' 10752 1726773071.77304: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpgl6wswk5", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 10752 1726773071.78422: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 10752 1726773071.78476: stderr chunk (state=3): >>><<< 10752 1726773071.78482: stdout chunk (state=3): >>><<< 10752 1726773071.78504: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpgl6wswk5", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 10752 1726773071.78538: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpgl6wswk5', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10752 1726773071.78551: _low_level_execute_command(): starting 10752 1726773071.78555: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773071.256807-10752-244284090468666/ > /dev/null 2>&1 && sleep 0' 10752 1726773071.81221: stderr chunk (state=2): >>><<< 10752 1726773071.81232: stdout chunk (state=2): >>><<< 10752 1726773071.81250: _low_level_execute_command() done: rc=0, stdout=, stderr= 10752 1726773071.81264: handler run complete 10752 1726773071.81300: attempt loop complete, returning result 10752 1726773071.81317: _execute() done 10752 1726773071.81320: dumping result to json 10752 1726773071.81324: done dumping result, returning 10752 1726773071.81336: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [12a3200b-1e9d-1dbd-cc52-0000000002f8] 10752 1726773071.81349: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f8 10752 1726773071.81388: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f8 10752 1726773071.81431: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8119 1726773071.81622: no more pending results, returning what we have 8119 1726773071.81628: results queue empty 8119 1726773071.81631: checking for any_errors_fatal 
8119 1726773071.81636: done checking for any_errors_fatal 8119 1726773071.81638: checking for max_fail_percentage 8119 1726773071.81641: done checking for max_fail_percentage 8119 1726773071.81643: checking to see if all hosts have failed and the running result is not ok 8119 1726773071.81645: done checking to see if all hosts have failed 8119 1726773071.81646: getting the remaining hosts for this loop 8119 1726773071.81649: done getting the remaining hosts for this loop 8119 1726773071.81657: building list of next tasks for hosts 8119 1726773071.81659: getting the next task for host managed_node2 8119 1726773071.81666: done getting next task for host managed_node2 8119 1726773071.81670: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773071.81673: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773071.81674: done building task lists 8119 1726773071.81675: counting tasks in each state of execution 8119 1726773071.81678: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773071.81679: advancing hosts in ITERATING_TASKS 8119 1726773071.81681: starting to advance hosts 8119 1726773071.81684: getting the next task for host managed_node2 8119 1726773071.81689: done getting next task for host managed_node2 8119 1726773071.81692: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773071.81695: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773071.81696: done advancing hosts to next task 8119 1726773071.81734: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773071.81739: getting variables 8119 1726773071.81741: in VariableManager get_vars() 8119 1726773071.81768: Calling all_inventory to load vars for managed_node2 8119 1726773071.81772: Calling groups_inventory to load vars for managed_node2 8119 1726773071.81774: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773071.81797: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.81813: Calling all_plugins_play to load vars for managed_node2 8119 1726773071.81827: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.81836: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773071.81846: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.81852: Calling groups_plugins_play to load vars for managed_node2 8119 1726773071.81861: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.81880: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.81898: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773071.82111: done with get_vars() 8119 1726773071.82122: done getting variables 8119 1726773071.82127: sending task start callback, copying the task so we can template it temporarily 8119 1726773071.82128: done copying, going to template now 8119 1726773071.82130: done templating 8119 1726773071.82132: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.620) 0:01:06.377 **** 8119 1726773071.82150: sending task start callback 8119 1726773071.82152: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773071.82269: worker is 1 (out of 1 available) 8119 1726773071.82309: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773071.82380: done queuing things up, now waiting for results queue to drain 8119 1726773071.82388: waiting for pending results... 
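The "plugin lookup ... failed" messages around here are expected: the collection ships kernel_settings_get_config only as a module, not as an action plugin, so Ansible falls back to the generic 'normal' action plugin (see the "Loading ActionModule 'normal'" line below) and runs the module remotely via AnsiballZ. Judging from the module args (path: /etc/tuned/kernel_settings/tuned.conf) and the "data" structure in the result that follows, the module essentially parses the tuned profile into per-section dictionaries. A hedged approximation of that behaviour, not the module's actual code:

import configparser

def get_config(path: str = "/etc/tuned/kernel_settings/tuned.conf") -> dict:
    # Parse the tuned profile and return each section (main, sysctl, sysfs,
    # ...) as a dict of option -> value, mirroring the "data" field in the
    # task result. Approximation only.
    parser = configparser.ConfigParser()
    parser.optionxform = str  # keep keys such as /sys/class/net/lo/mtu verbatim
    parser.read(path)
    return {section: dict(parser.items(section)) for section in parser.sections()}

# Against the profile on managed_node2 this yields the same shape as the
# result, e.g. {"main": {"summary": "kernel settings"},
#               "sysctl": {"fs.file-max": "400000", ...},
#               "sysfs": {"/sys/class/net/lo/mtu": "65000", ...}}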
10777 1726773071.82443: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10777 1726773071.82495: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002f9 10777 1726773071.82542: calling self._execute() 10777 1726773071.84324: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10777 1726773071.84409: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10777 1726773071.84459: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10777 1726773071.84504: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10777 1726773071.84533: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10777 1726773071.84562: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10777 1726773071.84606: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10777 1726773071.84633: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10777 1726773071.84649: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10777 1726773071.84729: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10777 1726773071.84748: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10777 1726773071.84762: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10777 1726773071.85051: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10777 1726773071.85090: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10777 1726773071.85101: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10777 1726773071.85114: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10777 1726773071.85120: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10777 1726773071.85202: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10777 1726773071.85217: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10777 1726773071.85242: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 10777 1726773071.85258: starting attempt loop 10777 1726773071.85260: running the handler 10777 1726773071.85268: _low_level_execute_command(): starting 10777 1726773071.85272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10777 1726773071.87753: stdout chunk (state=2): >>>/root <<< 10777 1726773071.87867: stderr chunk (state=3): >>><<< 10777 1726773071.87872: 
stdout chunk (state=3): >>><<< 10777 1726773071.87896: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10777 1726773071.87913: _low_level_execute_command(): starting 10777 1726773071.87919: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292 `" && echo ansible-tmp-1726773071.8790565-10777-44615583982292="` echo /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292 `" ) && sleep 0' 10777 1726773071.90704: stdout chunk (state=2): >>>ansible-tmp-1726773071.8790565-10777-44615583982292=/root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292 <<< 10777 1726773071.90838: stderr chunk (state=3): >>><<< 10777 1726773071.90843: stdout chunk (state=3): >>><<< 10777 1726773071.90860: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773071.8790565-10777-44615583982292=/root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292 , stderr= 10777 1726773071.90940: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 10777 1726773071.90994: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292/AnsiballZ_kernel_settings_get_config.py 10777 1726773071.91295: Sending initial data 10777 1726773071.91309: Sent initial data (173 bytes) 10777 1726773071.93755: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpe3ocztsz /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292/AnsiballZ_kernel_settings_get_config.py <<< 10777 1726773071.94748: stderr chunk (state=3): >>><<< 10777 1726773071.94753: stdout chunk (state=3): >>><<< 10777 1726773071.94773: done transferring module to remote 10777 1726773071.94788: _low_level_execute_command(): starting 10777 1726773071.94792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292/ /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10777 1726773071.97316: stderr chunk (state=2): >>><<< 10777 1726773071.97329: stdout chunk (state=2): >>><<< 10777 1726773071.97347: _low_level_execute_command() done: rc=0, stdout=, stderr= 10777 1726773071.97351: _low_level_execute_command(): starting 10777 1726773071.97357: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10777 1726773072.12453: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10777 1726773072.13459: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 10777 1726773072.13511: stderr chunk (state=3): >>><<< 10777 1726773072.13520: stdout chunk (state=3): >>><<< 10777 1726773072.13541: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 10777 1726773072.13599: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10777 1726773072.13615: _low_level_execute_command(): starting 10777 1726773072.13623: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773071.8790565-10777-44615583982292/ > /dev/null 2>&1 && sleep 0' 10777 1726773072.16275: stderr chunk (state=2): >>><<< 10777 1726773072.16288: stdout chunk (state=2): >>><<< 10777 1726773072.16306: _low_level_execute_command() done: rc=0, stdout=, stderr= 10777 1726773072.16317: handler run complete 10777 1726773072.16346: attempt loop complete, returning result 10777 1726773072.16360: _execute() done 10777 1726773072.16361: dumping result to json 10777 1726773072.16364: done dumping result, returning 10777 1726773072.16377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [12a3200b-1e9d-1dbd-cc52-0000000002f9] 10777 1726773072.16393: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f9 10777 1726773072.16433: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002f9 10777 1726773072.16439: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530" }, "sysfs": { "/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8119 1726773072.16644: no more pending results, returning what we have 8119 1726773072.16649: results queue empty 8119 1726773072.16651: checking for any_errors_fatal 8119 1726773072.16657: done checking for any_errors_fatal 8119 1726773072.16659: checking for max_fail_percentage 8119 1726773072.16662: done checking for max_fail_percentage 8119 1726773072.16664: checking to see if all hosts have failed and the running result is not ok 8119 1726773072.16666: done checking to see if all hosts have failed 8119 
1726773072.16668: getting the remaining hosts for this loop 8119 1726773072.16670: done getting the remaining hosts for this loop 8119 1726773072.16678: building list of next tasks for hosts 8119 1726773072.16681: getting the next task for host managed_node2 8119 1726773072.16691: done getting next task for host managed_node2 8119 1726773072.16695: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773072.16698: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773072.16701: done building task lists 8119 1726773072.16703: counting tasks in each state of execution 8119 1726773072.16706: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773072.16709: advancing hosts in ITERATING_TASKS 8119 1726773072.16711: starting to advance hosts 8119 1726773072.16713: getting the next task for host managed_node2 8119 1726773072.16717: done getting next task for host managed_node2 8119 1726773072.16720: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773072.16722: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773072.16723: done advancing hosts to next task 8119 1726773072.16736: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773072.16739: getting variables 8119 1726773072.16741: in VariableManager get_vars() 8119 1726773072.16768: Calling all_inventory to load vars for managed_node2 8119 1726773072.16771: Calling groups_inventory to load vars for managed_node2 8119 1726773072.16773: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773072.16800: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.16813: Calling all_plugins_play to load vars for managed_node2 8119 1726773072.16824: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.16833: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773072.16844: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.16850: Calling groups_plugins_play to load vars for managed_node2 8119 1726773072.16860: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.16877: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.16894: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.17102: done with get_vars() 8119 1726773072.17112: done getting variables 8119 1726773072.17118: sending task start callback, copying the task so we can template it temporarily 8119 1726773072.17120: done copying, going to template now 8119 1726773072.17122: done templating 8119 1726773072.17124: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.349) 0:01:06.727 **** 8119 1726773072.17143: sending task start callback 8119 1726773072.17145: entering _queue_task() for managed_node2/template 8119 1726773072.17260: worker is 1 (out of 1 available) 8119 1726773072.17300: exiting _queue_task() for managed_node2/template 8119 1726773072.17371: done queuing things up, now waiting for results queue to drain 8119 1726773072.17376: waiting for pending results... 
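The "Get current config" result reported above reflects the contents of /etc/tuned/kernel_settings/tuned.conf, a tuned profile in INI form whose [main], [sysctl], and [sysfs] sections map directly onto the main/sysctl/sysfs dictionaries in the task result. As a rough illustration only (a minimal sketch assuming a plain INI layout, not the collection's actual kernel_settings_get_config module code), the same mapping could be produced like this:

    # Sketch: read a tuned.conf-style INI profile and return the
    # main/sysctl/sysfs mapping shape seen in the task result above.
    import configparser

    def read_tuned_conf(path="/etc/tuned/kernel_settings/tuned.conf"):
        parser = configparser.ConfigParser()
        parser.optionxform = str  # keep sysctl keys and sysfs paths as written
        parser.read(path)
        return {section: dict(parser[section]) for section in parser.sections()}

The "Apply kernel settings" task that runs next re-renders this file from the role's kernel_settings.j2 template and overwrites it when the rendered content differs, as the stat and copy calls below show.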
10786 1726773072.17442: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 10786 1726773072.17494: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002fa 10786 1726773072.17543: calling self._execute() 10786 1726773072.19325: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10786 1726773072.19409: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10786 1726773072.19469: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10786 1726773072.19505: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10786 1726773072.19536: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10786 1726773072.19567: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10786 1726773072.19617: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10786 1726773072.19641: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10786 1726773072.19656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10786 1726773072.19740: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10786 1726773072.19756: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10786 1726773072.19770: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10786 1726773072.20175: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10786 1726773072.20210: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10786 1726773072.20222: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10786 1726773072.20235: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10786 1726773072.20242: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10786 1726773072.20338: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10786 1726773072.20355: starting attempt loop 10786 1726773072.20359: running the handler 10786 1726773072.20367: _low_level_execute_command(): starting 10786 1726773072.20371: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10786 1726773072.22851: stdout chunk (state=2): >>>/root <<< 10786 1726773072.22966: stderr chunk (state=3): >>><<< 10786 1726773072.22972: stdout chunk (state=3): >>><<< 10786 1726773072.22996: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10786 1726773072.23011: _low_level_execute_command(): starting 10786 1726773072.23019: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035 `" && echo ansible-tmp-1726773072.2300498-10786-195193794685035="` echo /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035 `" ) && sleep 0' 10786 1726773072.26098: stdout chunk (state=2): >>>ansible-tmp-1726773072.2300498-10786-195193794685035=/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035 <<< 10786 1726773072.26221: stderr chunk (state=3): >>><<< 10786 1726773072.26227: stdout chunk (state=3): >>><<< 10786 1726773072.26248: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773072.2300498-10786-195193794685035=/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035 , stderr= 10786 1726773072.26272: evaluation_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 10786 1726773072.26291: search_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 10786 1726773072.27749: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10786 1726773072.27754: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10786 1726773072.27757: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10786 1726773072.27759: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10786 1726773072.27761: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10786 1726773072.27763: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.27766: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10786 1726773072.27770: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10786 1726773072.27772: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10786 1726773072.27793: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10786 1726773072.27796: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 10786 1726773072.27798: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28059: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28065: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10786 1726773072.28068: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10786 1726773072.28070: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10786 1726773072.28072: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10786 1726773072.28074: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28075: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10786 1726773072.28077: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10786 1726773072.28079: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10786 1726773072.28096: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28099: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10786 1726773072.28101: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28131: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28135: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10786 1726773072.28137: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10786 1726773072.28139: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10786 1726773072.28141: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10786 1726773072.28142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28145: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10786 1726773072.28146: Loading FilterModule 'urls' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10786 1726773072.28148: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10786 1726773072.28165: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28168: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10786 1726773072.28171: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28335: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28340: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10786 1726773072.28342: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10786 1726773072.28344: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10786 1726773072.28346: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10786 1726773072.28348: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28349: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10786 1726773072.28351: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10786 1726773072.28353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10786 1726773072.28366: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28369: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10786 1726773072.28370: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28609: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28617: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10786 1726773072.28620: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10786 1726773072.28622: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, 
class_only=False) 10786 1726773072.28623: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10786 1726773072.28625: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28627: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10786 1726773072.28629: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10786 1726773072.28631: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10786 1726773072.28646: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28650: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10786 1726773072.28653: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28682: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28688: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10786 1726773072.28690: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10786 1726773072.28692: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10786 1726773072.28694: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10786 1726773072.28696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.28698: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10786 1726773072.28699: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10786 1726773072.28701: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10786 1726773072.28717: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10786 1726773072.28720: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10786 1726773072.28723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10786 1726773072.29800: Loading ActionModule 'copy' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10786 1726773072.29871: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 10786 1726773072.29906: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_stat.py 10786 1726773072.30224: Sending initial data 10786 1726773072.30239: Sent initial data (152 bytes) 10786 1726773072.32786: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp8k36gh_p /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_stat.py <<< 10786 1726773072.33789: stderr chunk (state=3): >>><<< 10786 1726773072.33794: stdout chunk (state=3): >>><<< 10786 1726773072.33815: done transferring module to remote 10786 1726773072.33828: _low_level_execute_command(): starting 10786 1726773072.33833: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/ /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_stat.py && sleep 0' 10786 1726773072.36357: stderr chunk (state=2): >>><<< 10786 1726773072.36367: stdout chunk (state=2): >>><<< 10786 1726773072.36387: _low_level_execute_command() done: rc=0, stdout=, stderr= 10786 1726773072.36391: _low_level_execute_command(): starting 10786 1726773072.36397: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_stat.py && sleep 0' 10786 1726773072.52225: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 318767235, "dev": 51713, "nlink": 1, "atime": 1726773035.2903237, "mtime": 1726773034.2293258, "ctime": 1726773034.4823253, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "mimetype": "text/plain", "charset": "us-ascii", "version": "687458791", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 10786 1726773072.53310: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 10786 1726773072.53361: stderr chunk (state=3): >>><<< 10786 1726773072.53366: stdout chunk (state=3): >>><<< 10786 1726773072.53388: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 318767235, "dev": 51713, "nlink": 1, "atime": 1726773035.2903237, "mtime": 1726773034.2293258, "ctime": 1726773034.4823253, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "mimetype": "text/plain", "charset": "us-ascii", "version": "687458791", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 10786 1726773072.53448: done with _execute_module (stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10786 1726773072.53796: Sending initial data 10786 1726773072.53815: Sent initial data (160 bytes) 10786 1726773072.56766: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpxdo6lpen/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source <<< 10786 1726773072.57081: stderr chunk (state=3): >>><<< 10786 1726773072.57089: stdout chunk (state=3): >>><<< 10786 1726773072.57117: _low_level_execute_command(): starting 10786 1726773072.57124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/ /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source && sleep 0' 10786 1726773072.59646: stderr chunk (state=2): >>><<< 10786 1726773072.59657: stdout chunk (state=2): >>><<< 10786 1726773072.59677: _low_level_execute_command() done: rc=0, stdout=, stderr= 10786 1726773072.59785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 10786 1726773072.59837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_copy.py 10786 1726773072.60109: Sending initial data 10786 1726773072.60123: Sent initial data (152 bytes) 10786 1726773072.62568: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp5wszc89p 
/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_copy.py <<< 10786 1726773072.63614: stderr chunk (state=3): >>><<< 10786 1726773072.63619: stdout chunk (state=3): >>><<< 10786 1726773072.63643: done transferring module to remote 10786 1726773072.63655: _low_level_execute_command(): starting 10786 1726773072.63659: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/ /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_copy.py && sleep 0' 10786 1726773072.66201: stderr chunk (state=2): >>><<< 10786 1726773072.66217: stdout chunk (state=2): >>><<< 10786 1726773072.66237: _low_level_execute_command() done: rc=0, stdout=, stderr= 10786 1726773072.66240: _low_level_execute_command(): starting 10786 1726773072.66246: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/AnsiballZ_copy.py && sleep 0' 10786 1726773072.82610: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 10786 1726773072.83719: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10786 1726773072.83770: stderr chunk (state=3): >>><<< 10786 1726773072.83776: stdout chunk (state=3): >>><<< 10786 1726773072.83799: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
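The stat and copy sequence above is the template action's idempotence check: the remote file's SHA-1 is compared against the freshly rendered template, and the copy only rewrites the file when they differ. A minimal sketch of that decision (the real logic lives inside Ansible's template/copy action plugins, so this is only an approximation):

    # Sketch: copy the rendered template only when its SHA-1 differs from
    # the checksum reported by the remote stat call.
    import hashlib

    def template_needs_copy(rendered_text: str, remote_sha1: str) -> bool:
        local_sha1 = hashlib.sha1(rendered_text.encode("utf-8")).hexdigest()
        return local_sha1 != remote_sha1

Here the existing remote file hashed to 3feaf86b... while the rendered template hashed to 221aa34f..., so the copy module rewrote /etc/tuned/kernel_settings/tuned.conf and the task result that follows is reported as changed.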
10786 1726773072.83835: done with _execute_module (copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '221aa34fef95c2fe05408be9921820449785a5b2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10786 1726773072.83871: _low_level_execute_command(): starting 10786 1726773072.83878: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/ > /dev/null 2>&1 && sleep 0' 10786 1726773072.86540: stderr chunk (state=2): >>><<< 10786 1726773072.86553: stdout chunk (state=2): >>><<< 10786 1726773072.86574: _low_level_execute_command() done: rc=0, stdout=, stderr= 10786 1726773072.86600: handler run complete 10786 1726773072.86633: attempt loop complete, returning result 10786 1726773072.86638: _execute() done 10786 1726773072.86640: dumping result to json 10786 1726773072.86644: done dumping result, returning 10786 1726773072.86657: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [12a3200b-1e9d-1dbd-cc52-0000000002fa] 10786 1726773072.86672: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fa 10786 1726773072.86715: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fa 10786 1726773072.86767: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "src": "/root/.ansible/tmp/ansible-tmp-1726773072.2300498-10786-195193794685035/source", "state": "file", "uid": 0 } 8119 1726773072.86927: no more pending results, returning what we have 8119 1726773072.86933: results queue empty 8119 1726773072.86935: checking for any_errors_fatal 8119 1726773072.86939: done checking for any_errors_fatal 8119 1726773072.86941: checking for max_fail_percentage 8119 1726773072.86945: done checking for max_fail_percentage 8119 1726773072.86946: checking to see if all hosts have failed and the running result is not ok 8119 1726773072.86949: done checking to see if all hosts have failed 8119 1726773072.86951: getting the remaining hosts for this loop 8119 1726773072.86953: done getting the remaining hosts for this loop 8119 1726773072.86961: building list of next tasks for hosts 8119 1726773072.86964: getting the next task for host managed_node2 8119 1726773072.86971: done getting next task for host managed_node2 8119 1726773072.86975: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773072.86979: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, 
run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773072.86981: done building task lists 8119 1726773072.86985: counting tasks in each state of execution 8119 1726773072.86989: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773072.86991: advancing hosts in ITERATING_TASKS 8119 1726773072.86993: starting to advance hosts 8119 1726773072.86995: getting the next task for host managed_node2 8119 1726773072.86999: done getting next task for host managed_node2 8119 1726773072.87002: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773072.87005: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773072.87007: done advancing hosts to next task 8119 1726773072.87025: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773072.87030: getting variables 8119 1726773072.87032: in VariableManager get_vars() 8119 1726773072.87085: Calling all_inventory to load vars for managed_node2 8119 1726773072.87092: Calling groups_inventory to load vars for managed_node2 8119 1726773072.87095: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773072.87126: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.87139: Calling all_plugins_play to load vars for managed_node2 8119 1726773072.87149: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.87158: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773072.87168: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.87174: Calling groups_plugins_play to load vars for managed_node2 8119 1726773072.87186: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.87213: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.87229: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.87436: done with get_vars() 8119 1726773072.87446: done getting variables 8119 1726773072.87451: sending task start callback, copying the task so we can template it temporarily 8119 1726773072.87453: done copying, going to template now 8119 1726773072.87454: done templating 8119 1726773072.87456: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.703) 0:01:07.431 **** 8119 1726773072.87472: sending task start callback 8119 1726773072.87473: entering _queue_task() for managed_node2/service 8119 1726773072.87602: worker is 1 (out of 1 available) 8119 1726773072.87643: exiting _queue_task() for managed_node2/service 8119 1726773072.87719: done queuing things up, now waiting for results queue to drain 8119 1726773072.87725: waiting for pending results... 10805 1726773072.87779: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 10805 1726773072.87833: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002fb 10805 1726773072.89499: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10805 1726773072.89603: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10805 1726773072.89656: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10805 1726773072.89682: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10805 1726773072.89717: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10805 1726773072.89747: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10805 1726773072.89793: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10805 1726773072.89820: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10805 1726773072.89837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10805 1726773072.89918: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10805 1726773072.89943: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10805 1726773072.89961: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10805 1726773072.90172: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10805 1726773072.90178: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10805 1726773072.90180: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10805 1726773072.90182: Loading FilterModule 
'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10805 1726773072.90186: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10805 1726773072.90188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10805 1726773072.90190: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10805 1726773072.90192: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10805 1726773072.90194: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10805 1726773072.90213: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10805 1726773072.90216: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10805 1726773072.90218: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10805 1726773072.90364: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10805 1726773072.90369: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10805 1726773072.90371: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10805 1726773072.90373: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10805 1726773072.90375: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10805 1726773072.90376: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10805 1726773072.90378: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10805 1726773072.90380: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10805 1726773072.90381: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10805 1726773072.90406: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10805 1726773072.90410: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10805 1726773072.90413: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 10805 1726773072.90609: when evaluation is False, skipping this task 10805 1726773072.90645: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10805 1726773072.90649: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10805 1726773072.90651: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10805 1726773072.90653: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10805 1726773072.90655: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10805 1726773072.90658: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10805 1726773072.90661: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10805 1726773072.90663: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10805 1726773072.90666: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10805 1726773072.90686: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10805 1726773072.90689: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10805 1726773072.90691: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10805 1726773072.90764: dumping result to json 10805 1726773072.90769: done dumping result, returning 10805 1726773072.90774: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [12a3200b-1e9d-1dbd-cc52-0000000002fb] 10805 1726773072.90785: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fb 10805 1726773072.90829: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fb 10805 1726773072.90833: WORKER PROCESS EXITING skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "item": "tuned", "skip_reason": "Conditional result was False" } 8119 1726773072.90981: no more pending results, returning what we have 8119 1726773072.90990: results queue empty 8119 1726773072.90992: checking for any_errors_fatal 8119 1726773072.91000: done checking for any_errors_fatal 8119 1726773072.91002: checking for max_fail_percentage 8119 1726773072.91005: done checking for max_fail_percentage 8119 1726773072.91007: checking to see if all hosts have failed and the running result is not ok 8119 1726773072.91009: done checking to see if all hosts have failed 8119 1726773072.91013: getting the remaining hosts for this loop 8119 1726773072.91016: done getting the remaining hosts for this loop 8119 1726773072.91023: building list of next tasks for 
hosts 8119 1726773072.91026: getting the next task for host managed_node2 8119 1726773072.91033: done getting next task for host managed_node2 8119 1726773072.91038: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773072.91042: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773072.91044: done building task lists 8119 1726773072.91046: counting tasks in each state of execution 8119 1726773072.91050: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773072.91052: advancing hosts in ITERATING_TASKS 8119 1726773072.91054: starting to advance hosts 8119 1726773072.91056: getting the next task for host managed_node2 8119 1726773072.91060: done getting next task for host managed_node2 8119 1726773072.91062: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773072.91066: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773072.91067: done advancing hosts to next task 8119 1726773072.91084: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773072.91089: getting variables 8119 1726773072.91091: in VariableManager get_vars() 8119 1726773072.91122: Calling all_inventory to load vars for managed_node2 8119 1726773072.91125: Calling groups_inventory to load vars for managed_node2 8119 1726773072.91128: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773072.91149: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.91159: Calling all_plugins_play to load vars for managed_node2 8119 1726773072.91168: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.91177: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773072.91191: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.91201: Calling groups_plugins_play to load vars for managed_node2 8119 1726773072.91213: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.91232: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.91245: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773072.91452: done with get_vars() 8119 1726773072.91463: done getting variables 8119 1726773072.91468: sending task start callback, copying the task so we can template it temporarily 8119 1726773072.91469: done copying, going to template now 8119 1726773072.91471: done templating 8119 1726773072.91472: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.040) 0:01:07.471 **** 8119 1726773072.91491: sending task start callback 8119 1726773072.91494: entering _queue_task() for managed_node2/command 8119 1726773072.91620: worker is 1 (out of 1 available) 8119 1726773072.91656: exiting _queue_task() for managed_node2/command 8119 1726773072.91730: done queuing things up, now waiting for results queue to drain 8119 1726773072.91735: waiting for pending results... 
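The "Tuned apply settings" task queued above shells out to tuned-adm to activate the combined profile, which makes tuned apply the sysctl/sysfs values from the freshly written kernel_settings profile on top of the base virtual-guest profile. A rough standalone equivalent of what the command module executes below (the argv, including the single space-separated profile argument, is taken verbatim from the module result that follows):

    # Sketch: activate the combined tuned profile, mirroring the
    # command-module invocation shown in the next log segment.
    import subprocess

    subprocess.run(
        ["tuned-adm", "profile", "virtual-guest kernel_settings"],
        check=True,
    )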
10807 1726773072.91792: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 10807 1726773072.91844: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002fc 10807 1726773072.91889: calling self._execute() 10807 1726773072.93652: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10807 1726773072.93731: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10807 1726773072.93782: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10807 1726773072.93811: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10807 1726773072.93838: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10807 1726773072.93869: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10807 1726773072.93915: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10807 1726773072.93939: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10807 1726773072.93955: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10807 1726773072.94034: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10807 1726773072.94050: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10807 1726773072.96537: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10807 1726773072.97143: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10807 1726773072.97175: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10807 1726773072.97188: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10807 1726773072.97200: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10807 1726773072.97206: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10807 1726773072.97276: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10807 1726773072.97284: starting attempt loop 10807 1726773072.97287: running the handler 10807 1726773072.97293: _low_level_execute_command(): starting 10807 1726773072.97296: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10807 1726773072.99816: stdout chunk (state=2): >>>/root <<< 10807 1726773072.99938: stderr chunk (state=3): >>><<< 10807 1726773072.99943: stdout chunk (state=3): >>><<< 10807 1726773072.99965: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10807 1726773072.99979: _low_level_execute_command(): starting 10807 1726773072.99987: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507 `" && echo ansible-tmp-1726773072.9997337-10807-116638347307507="` echo /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507 `" ) && sleep 0' 10807 1726773073.02789: stdout chunk (state=2): >>>ansible-tmp-1726773072.9997337-10807-116638347307507=/root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507 <<< 10807 1726773073.02917: stderr chunk (state=3): >>><<< 10807 1726773073.02924: stdout chunk (state=3): >>><<< 10807 1726773073.02947: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773072.9997337-10807-116638347307507=/root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507 , stderr= 10807 1726773073.03057: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 10807 1726773073.03117: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507/AnsiballZ_command.py 10807 1726773073.03405: Sending initial data 10807 1726773073.03423: Sent initial data (155 bytes) 10807 1726773073.05856: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmplveyes7t /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507/AnsiballZ_command.py <<< 10807 1726773073.06868: stderr chunk (state=3): >>><<< 10807 1726773073.06874: stdout chunk (state=3): >>><<< 10807 1726773073.06904: done transferring module to remote 10807 1726773073.06919: _low_level_execute_command(): starting 10807 1726773073.06924: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507/ /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507/AnsiballZ_command.py && sleep 0' 10807 1726773073.09507: stderr chunk (state=2): >>><<< 10807 1726773073.09518: stdout chunk (state=2): >>><<< 10807 1726773073.09536: _low_level_execute_command() done: rc=0, stdout=, stderr= 10807 1726773073.09540: _low_level_execute_command(): starting 10807 1726773073.09546: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507/AnsiballZ_command.py && sleep 0' 10807 1726773074.32923: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:13.165955", "end": "2024-09-19 15:11:14.324154", "delta": "0:00:01.158199", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10807 1726773074.33840: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 10807 1726773074.33890: stderr chunk (state=3): >>><<< 10807 1726773074.33897: stdout chunk (state=3): >>><<< 10807 1726773074.33921: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:13.165955", "end": "2024-09-19 15:11:14.324154", "delta": "0:00:01.158199", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 10807 1726773074.33952: done with _execute_module (command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10807 1726773074.33963: _low_level_execute_command(): starting 10807 1726773074.33968: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773072.9997337-10807-116638347307507/ > /dev/null 2>&1 && sleep 0' 10807 1726773074.36758: stderr chunk (state=2): >>><<< 10807 1726773074.36769: stdout chunk (state=2): >>><<< 10807 1726773074.36796: _low_level_execute_command() done: rc=0, stdout=, stderr= 10807 1726773074.36805: handler run complete 10807 1726773074.36816: attempt loop complete, returning result 10807 1726773074.36830: _execute() done 10807 1726773074.36832: dumping result to json 10807 1726773074.36836: done dumping result, returning 10807 1726773074.36849: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [12a3200b-1e9d-1dbd-cc52-0000000002fc] 10807 1726773074.36862: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fc 10807 1726773074.36920: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fc 10807 1726773074.36978: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.158199", "end": "2024-09-19 15:11:14.324154", "rc": 0, "start": "2024-09-19 15:11:13.165955" } 8119 1726773074.37139: no more pending results, returning what we have 8119 1726773074.37144: results queue empty 8119 1726773074.37146: checking for any_errors_fatal 8119 1726773074.37151: done checking for any_errors_fatal 8119 1726773074.37153: checking for max_fail_percentage 8119 1726773074.37156: done checking for max_fail_percentage 8119 1726773074.37158: checking to see if all hosts have failed and the running result is not ok 8119 1726773074.37160: done checking to see if all hosts have failed 8119 1726773074.37162: getting the remaining hosts for this loop 8119 1726773074.37164: done getting the remaining hosts for this loop 8119 1726773074.37172: building list of next tasks for hosts 8119 1726773074.37174: 
getting the next task for host managed_node2 8119 1726773074.37182: done getting next task for host managed_node2 8119 1726773074.37188: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773074.37192: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.37195: done building task lists 8119 1726773074.37196: counting tasks in each state of execution 8119 1726773074.37200: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773074.37203: advancing hosts in ITERATING_TASKS 8119 1726773074.37205: starting to advance hosts 8119 1726773074.37207: getting the next task for host managed_node2 8119 1726773074.37213: done getting next task for host managed_node2 8119 1726773074.37216: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773074.37219: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773074.37221: done advancing hosts to next task 8119 1726773074.37236: getting variables 8119 1726773074.37240: in VariableManager get_vars() 8119 1726773074.37274: Calling all_inventory to load vars for managed_node2 8119 1726773074.37280: Calling groups_inventory to load vars for managed_node2 8119 1726773074.37286: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773074.37309: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.37323: Calling all_plugins_play to load vars for managed_node2 8119 1726773074.37334: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.37342: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773074.37352: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.37358: Calling groups_plugins_play to load vars for managed_node2 8119 1726773074.37367: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.37389: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.37408: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.37613: done with get_vars() 8119 1726773074.37626: done getting variables 8119 1726773074.37631: sending task start callback, copying the task so we can template it temporarily 8119 1726773074.37632: done copying, going to template now 8119 1726773074.37634: done templating 8119 1726773074.37636: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:14 -0400 (0:00:01.461) 0:01:08.932 **** 8119 1726773074.37652: sending task start callback 8119 1726773074.37653: entering _queue_task() for managed_node2/include_tasks 8119 1726773074.37784: worker is 1 (out of 1 available) 8119 1726773074.37825: exiting _queue_task() for managed_node2/include_tasks 8119 1726773074.37897: done queuing things up, now waiting for results queue to drain 8119 1726773074.37903: waiting for pending results... 
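The apply step above switched TuneD to the combined profile 'virtual-guest kernel_settings' (rc=0, changed), layering the role-managed kernel_settings profile on top of the pre-existing virtual-guest profile; the verification tasks being included next check that this took effect. A manual spot-check on the managed node could look like the following sketch, which simply repeats the same tuned-adm invocation by hand and is not part of the role itself.

# Manual spot-check on the managed node (not part of the role).
tuned-adm profile 'virtual-guest kernel_settings'   # same command the task above ran
tuned-adm active                                    # expected to report the combined profile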
10828 1726773074.37957: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 10828 1726773074.38015: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002fd 10828 1726773074.38061: calling self._execute() 10828 1726773074.39776: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10828 1726773074.39858: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10828 1726773074.39910: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10828 1726773074.39941: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10828 1726773074.39967: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10828 1726773074.39996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10828 1726773074.40093: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10828 1726773074.40118: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10828 1726773074.40137: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10828 1726773074.40213: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10828 1726773074.40234: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10828 1726773074.40249: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10828 1726773074.40516: _execute() done 10828 1726773074.40521: dumping result to json 10828 1726773074.40523: done dumping result, returning 10828 1726773074.40526: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [12a3200b-1e9d-1dbd-cc52-0000000002fd] 10828 1726773074.40535: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fd 10828 1726773074.40562: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fd 10828 1726773074.40566: WORKER PROCESS EXITING 8119 1726773074.40752: no more pending results, returning what we have 8119 1726773074.40761: in VariableManager get_vars() 8119 1726773074.40807: Calling all_inventory to load vars for managed_node2 8119 1726773074.40814: Calling groups_inventory to load vars for managed_node2 8119 1726773074.40817: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773074.40839: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.40849: Calling all_plugins_play to load vars for managed_node2 8119 1726773074.40859: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.40868: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773074.40878: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.40886: Calling groups_plugins_play to load vars for managed_node2 
8119 1726773074.40897: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.40917: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.40931: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.41143: done with get_vars() 8119 1726773074.41181: we have included files to process 8119 1726773074.41186: generating all_blocks data 8119 1726773074.41190: done generating all_blocks data 8119 1726773074.41194: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773074.41196: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773074.41200: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8119 1726773074.41414: done processing included file 8119 1726773074.41418: iterating over new_blocks loaded from include file 8119 1726773074.41421: in VariableManager get_vars() 8119 1726773074.41440: done with get_vars() 8119 1726773074.41442: filtering new block on tags 8119 1726773074.41498: done filtering new block on tags 8119 1726773074.41508: done iterating over new_blocks loaded from include file 8119 1726773074.41512: extending task lists for all hosts with included blocks 8119 1726773074.41840: done extending task lists 8119 1726773074.41844: done processing included files 8119 1726773074.41845: results queue empty 8119 1726773074.41847: checking for any_errors_fatal 8119 1726773074.41850: done checking for any_errors_fatal 8119 1726773074.41851: checking for max_fail_percentage 8119 1726773074.41853: done checking for max_fail_percentage 8119 1726773074.41854: checking to see if all hosts have failed and the running result is not ok 8119 1726773074.41856: done checking to see if all hosts have failed 8119 1726773074.41857: getting the remaining hosts for this loop 8119 1726773074.41859: done getting the remaining hosts for this loop 8119 1726773074.41862: building list of next tasks for hosts 8119 1726773074.41864: getting the next task for host managed_node2 8119 1726773074.41869: done getting next task for host managed_node2 8119 1726773074.41873: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773074.41876: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.41878: done building task lists 8119 1726773074.41880: counting tasks in each state of execution 8119 1726773074.41882: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773074.41885: advancing hosts in ITERATING_TASKS 8119 1726773074.41887: starting to advance hosts 8119 1726773074.41888: getting the next task for host managed_node2 8119 1726773074.41891: done getting next task for host managed_node2 8119 1726773074.41893: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773074.41895: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.41897: done advancing hosts to next task 8119 1726773074.41902: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773074.41905: getting variables 8119 1726773074.41906: in VariableManager get_vars() 8119 1726773074.41921: Calling all_inventory to load vars for managed_node2 8119 1726773074.41924: Calling groups_inventory to load vars for managed_node2 8119 1726773074.41926: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773074.41940: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.41948: Calling all_plugins_play to load vars for managed_node2 8119 1726773074.41957: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.41965: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773074.41975: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.41985: Calling groups_plugins_play to load vars for managed_node2 8119 1726773074.41999: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.42020: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.42034: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 
8119 1726773074.42214: done with get_vars() 8119 1726773074.42224: done getting variables 8119 1726773074.42229: sending task start callback, copying the task so we can template it temporarily 8119 1726773074.42231: done copying, going to template now 8119 1726773074.42232: done templating 8119 1726773074.42234: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.045) 0:01:08.978 **** 8119 1726773074.42249: sending task start callback 8119 1726773074.42250: entering _queue_task() for managed_node2/command 8119 1726773074.42367: worker is 1 (out of 1 available) 8119 1726773074.42405: exiting _queue_task() for managed_node2/command 8119 1726773074.42474: done queuing things up, now waiting for results queue to drain 8119 1726773074.42479: waiting for pending results... 10830 1726773074.42538: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 10830 1726773074.42597: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000469 10830 1726773074.42642: calling self._execute() 10830 1726773074.42812: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10830 1726773074.42857: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10830 1726773074.42869: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10830 1726773074.42880: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10830 1726773074.42889: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10830 1726773074.43010: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10830 1726773074.43032: starting attempt loop 10830 1726773074.43034: running the handler 10830 1726773074.43046: _low_level_execute_command(): starting 10830 1726773074.43053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10830 1726773074.45540: stdout chunk (state=2): >>>/root <<< 10830 1726773074.45656: stderr chunk (state=3): >>><<< 10830 1726773074.45662: stdout chunk (state=3): >>><<< 10830 1726773074.45685: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10830 1726773074.45701: _low_level_execute_command(): starting 10830 1726773074.45706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470 `" && echo ansible-tmp-1726773074.4569428-10830-75326517992470="` echo /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470 `" ) && sleep 0' 10830 1726773074.48537: stdout chunk (state=2): >>>ansible-tmp-1726773074.4569428-10830-75326517992470=/root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470 <<< 10830 1726773074.48654: stderr chunk (state=3): >>><<< 10830 1726773074.48659: stdout chunk (state=3): >>><<< 
10830 1726773074.48678: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773074.4569428-10830-75326517992470=/root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470 , stderr= 10830 1726773074.48817: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 10830 1726773074.48874: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470/AnsiballZ_command.py 10830 1726773074.49172: Sending initial data 10830 1726773074.49190: Sent initial data (154 bytes) 10830 1726773074.51855: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpo_mcvx_g /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470/AnsiballZ_command.py <<< 10830 1726773074.53812: stderr chunk (state=3): >>><<< 10830 1726773074.53822: stdout chunk (state=3): >>><<< 10830 1726773074.53853: done transferring module to remote 10830 1726773074.53873: _low_level_execute_command(): starting 10830 1726773074.53880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470/ /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470/AnsiballZ_command.py && sleep 0' 10830 1726773074.57338: stderr chunk (state=2): >>><<< 10830 1726773074.57351: stdout chunk (state=2): >>><<< 10830 1726773074.57371: _low_level_execute_command() done: rc=0, stdout=, stderr= 10830 1726773074.57375: _low_level_execute_command(): starting 10830 1726773074.57386: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470/AnsiballZ_command.py && sleep 0' 10830 1726773074.83153: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:14.725833", "end": "2024-09-19 15:11:14.829573", "delta": "0:00:00.103740", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10830 1726773074.84283: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10830 1726773074.84338: stderr chunk (state=3): >>><<< 10830 1726773074.84344: stdout chunk (state=3): >>><<< 10830 1726773074.84367: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:14.725833", "end": "2024-09-19 15:11:14.829573", "delta": "0:00:00.103740", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
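The verification command above ("tuned-adm verify -i", where -i tells tuned-adm to ignore settings it cannot check, such as those for missing devices) returned rc=0 with "Verification succeeded, current system settings match the preset profile." The same check can be repeated by hand on the managed node; this is a manual sketch, not the role's task, and the log path is the one named in the command's own output.

# Re-run the verification by hand; the exit status carries the result and the
# TuneD log named in the output holds the per-setting details.
tuned-adm verify -i
echo "verify rc=$?"
tail -n 20 /var/log/tuned/tuned.log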
10830 1726773074.84401: done with _execute_module (command, {'_raw_params': 'tuned-adm verify -i', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10830 1726773074.84413: _low_level_execute_command(): starting 10830 1726773074.84418: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773074.4569428-10830-75326517992470/ > /dev/null 2>&1 && sleep 0' 10830 1726773074.87077: stderr chunk (state=2): >>><<< 10830 1726773074.87091: stdout chunk (state=2): >>><<< 10830 1726773074.87113: _low_level_execute_command() done: rc=0, stdout=, stderr= 10830 1726773074.87121: handler run complete 10830 1726773074.87163: attempt loop complete, returning result 10830 1726773074.87180: _execute() done 10830 1726773074.87182: dumping result to json 10830 1726773074.87188: done dumping result, returning 10830 1726773074.87200: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [12a3200b-1e9d-1dbd-cc52-000000000469] 10830 1726773074.87214: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000469 10830 1726773074.87253: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000469 10830 1726773074.87257: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.103740", "end": "2024-09-19 15:11:14.829573", "rc": 0, "start": "2024-09-19 15:11:14.725833" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8119 1726773074.87507: no more pending results, returning what we have 8119 1726773074.87516: results queue empty 8119 1726773074.87519: checking for any_errors_fatal 8119 1726773074.87522: done checking for any_errors_fatal 8119 1726773074.87523: checking for max_fail_percentage 8119 1726773074.87525: done checking for max_fail_percentage 8119 1726773074.87527: checking to see if all hosts have failed and the running result is not ok 8119 1726773074.87528: done checking to see if all hosts have failed 8119 1726773074.87529: getting the remaining hosts for this loop 8119 1726773074.87531: done getting the remaining hosts for this loop 8119 1726773074.87536: building list of next tasks for hosts 8119 1726773074.87538: getting the next task for host managed_node2 8119 1726773074.87544: done getting next task for host managed_node2 8119 1726773074.87546: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773074.87550: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.87552: done building task lists 8119 1726773074.87553: counting tasks in each state of execution 8119 1726773074.87556: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773074.87557: advancing hosts in ITERATING_TASKS 8119 1726773074.87559: starting to advance hosts 8119 1726773074.87560: getting the next task for host managed_node2 8119 1726773074.87563: done getting next task for host managed_node2 8119 1726773074.87565: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773074.87568: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773074.87572: done advancing hosts to next task 8119 1726773074.87593: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773074.87600: getting variables 8119 1726773074.87603: in VariableManager get_vars() 8119 1726773074.87634: Calling all_inventory to load vars for managed_node2 8119 1726773074.87638: Calling groups_inventory to load vars for managed_node2 8119 1726773074.87640: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773074.87663: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.87673: Calling all_plugins_play to load vars for managed_node2 8119 1726773074.87685: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.87697: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773074.87712: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.87719: Calling groups_plugins_play to load vars for managed_node2 8119 1726773074.87729: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.87746: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.87759: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.87966: done with get_vars() 8119 1726773074.87976: done getting variables 8119 1726773074.87981: sending task start callback, copying the task so we can template it temporarily 8119 1726773074.87987: done copying, going to template now 8119 1726773074.87989: done templating 8119 1726773074.87990: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.457) 0:01:09.436 **** 8119 1726773074.88006: sending task start callback 8119 1726773074.88008: entering _queue_task() for managed_node2/shell 8119 1726773074.88133: worker is 1 (out of 1 available) 8119 1726773074.88171: exiting _queue_task() for managed_node2/shell 8119 1726773074.88246: done queuing things up, now waiting for results queue to drain 8119 1726773074.88251: waiting for pending results... 
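The queued task "Get last verify results from log" is a shell action which, as the entries that follow show, ends up skipped because its when condition evaluates to False, presumably because the verification above already succeeded. When debugging a failed verification by hand, the simplest equivalent is to look at the end of the TuneD log that tuned-adm points to; the line below is a manual inspection, not the role's actual shell command.

# Manual inspection of the TuneD log referenced by the verify output above.
tail -n 50 /var/log/tuned/tuned.log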
10855 1726773074.88313: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 10855 1726773074.88373: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000046a 10855 1726773074.88420: calling self._execute() 10855 1726773074.90472: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10855 1726773074.90560: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10855 1726773074.90615: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10855 1726773074.90642: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10855 1726773074.90671: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10855 1726773074.90703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10855 1726773074.90760: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10855 1726773074.90790: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10855 1726773074.90807: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10855 1726773074.90887: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10855 1726773074.90906: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10855 1726773074.90922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10855 1726773074.91176: when evaluation is False, skipping this task 10855 1726773074.91180: _execute() done 10855 1726773074.91182: dumping result to json 10855 1726773074.91189: done dumping result, returning 10855 1726773074.91194: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [12a3200b-1e9d-1dbd-cc52-00000000046a] 10855 1726773074.91201: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000046a 10855 1726773074.91231: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000046a 10855 1726773074.91235: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773074.91421: no more pending results, returning what we have 8119 1726773074.91426: results queue empty 8119 1726773074.91429: checking for any_errors_fatal 8119 1726773074.91434: done checking for any_errors_fatal 8119 1726773074.91436: checking for max_fail_percentage 8119 1726773074.91439: done checking for max_fail_percentage 8119 1726773074.91441: checking to see if all hosts have failed and the running result is not ok 8119 1726773074.91443: done checking to see if all hosts have failed 8119 1726773074.91445: getting the remaining hosts for this loop 8119 1726773074.91447: done getting the remaining hosts for this loop 8119 1726773074.91454: building list of next tasks for hosts 8119 1726773074.91457: getting the next task for host managed_node2 8119 1726773074.91464: done getting next task for host managed_node2 8119 1726773074.91468: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader 
errors 8119 1726773074.91472: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.91474: done building task lists 8119 1726773074.91475: counting tasks in each state of execution 8119 1726773074.91478: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773074.91480: advancing hosts in ITERATING_TASKS 8119 1726773074.91481: starting to advance hosts 8119 1726773074.91484: getting the next task for host managed_node2 8119 1726773074.91488: done getting next task for host managed_node2 8119 1726773074.91490: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8119 1726773074.91492: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773074.91493: done advancing hosts to next task 8119 1726773074.91504: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773074.91508: getting variables 8119 1726773074.91512: in VariableManager get_vars() 8119 1726773074.91538: Calling all_inventory to load vars for managed_node2 8119 1726773074.91542: Calling groups_inventory to load vars for managed_node2 8119 1726773074.91544: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773074.91563: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.91576: Calling all_plugins_play to load vars for managed_node2 8119 1726773074.91590: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.91600: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773074.91613: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.91620: Calling groups_plugins_play to load vars for managed_node2 8119 1726773074.91630: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.91647: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.91661: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.91872: done with get_vars() 8119 1726773074.91885: done getting variables 8119 1726773074.91891: sending task start callback, copying the task so we can template it temporarily 8119 1726773074.91892: done copying, going to template now 8119 1726773074.91894: done templating 8119 1726773074.91896: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.039) 0:01:09.475 **** 8119 1726773074.91916: sending task start callback 8119 1726773074.91919: entering _queue_task() for managed_node2/fail 8119 1726773074.92030: worker is 1 (out of 1 available) 8119 1726773074.92065: exiting _queue_task() for managed_node2/fail 8119 1726773074.92139: done queuing things up, now waiting for results queue to drain 8119 1726773074.92144: waiting for pending results... 
10858 1726773074.92205: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 10858 1726773074.92261: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000046b 10858 1726773074.92309: calling self._execute() 10858 1726773074.94547: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10858 1726773074.94666: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10858 1726773074.94744: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10858 1726773074.94788: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10858 1726773074.94831: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10858 1726773074.94889: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10858 1726773074.94949: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10858 1726773074.94984: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10858 1726773074.95010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10858 1726773074.95123: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10858 1726773074.95149: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10858 1726773074.95166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10858 1726773074.95480: when evaluation is False, skipping this task 10858 1726773074.95487: _execute() done 10858 1726773074.95490: dumping result to json 10858 1726773074.95493: done dumping result, returning 10858 1726773074.95499: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [12a3200b-1e9d-1dbd-cc52-00000000046b] 10858 1726773074.95510: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000046b 10858 1726773074.95539: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000046b 10858 1726773074.95543: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773074.96037: no more pending results, returning what we have 8119 1726773074.96042: results queue empty 8119 1726773074.96045: checking for any_errors_fatal 8119 1726773074.96050: done checking for any_errors_fatal 8119 1726773074.96052: checking for max_fail_percentage 8119 1726773074.96055: done checking for max_fail_percentage 8119 1726773074.96058: checking to see if all hosts have failed and the running result is not ok 8119 1726773074.96060: done checking to see if all hosts have failed 8119 1726773074.96062: getting the remaining hosts for this loop 8119 1726773074.96065: done getting the remaining hosts for this loop 8119 1726773074.96073: building list of next tasks for hosts 8119 1726773074.96076: getting the next task for host managed_node2 8119 1726773074.96087: done getting next task for host managed_node2 8119 1726773074.96094: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag 
that reboot is needed to apply changes 8119 1726773074.96099: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.96102: done building task lists 8119 1726773074.96104: counting tasks in each state of execution 8119 1726773074.96108: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773074.96111: advancing hosts in ITERATING_TASKS 8119 1726773074.96113: starting to advance hosts 8119 1726773074.96115: getting the next task for host managed_node2 8119 1726773074.96121: done getting next task for host managed_node2 8119 1726773074.96125: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8119 1726773074.96128: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.96131: done advancing hosts to next task 8119 1726773074.96147: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773074.96153: getting variables 8119 1726773074.96156: in VariableManager get_vars() 8119 1726773074.96196: Calling all_inventory to load vars for managed_node2 8119 1726773074.96203: Calling groups_inventory to load vars for managed_node2 8119 1726773074.96207: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773074.96237: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.96254: Calling all_plugins_play to load vars for managed_node2 8119 1726773074.96272: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.96288: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773074.96308: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.96320: Calling groups_plugins_play to load vars for managed_node2 8119 1726773074.96338: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.96369: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.96396: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.96741: done with get_vars() 8119 1726773074.96756: done getting variables 8119 1726773074.96764: sending task start callback, copying the task so we can template it temporarily 8119 1726773074.96767: done copying, going to template now 8119 1726773074.96770: done templating 8119 1726773074.96773: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.048) 0:01:09.524 **** 8119 1726773074.96800: sending task start callback 8119 1726773074.96803: entering _queue_task() for managed_node2/set_fact 8119 1726773074.96954: worker is 1 (out of 1 available) 8119 1726773074.97005: exiting _queue_task() for managed_node2/set_fact 8119 1726773074.97080: done queuing things up, now waiting for results queue to drain 8119 1726773074.97089: waiting for pending results... 10862 1726773074.97389: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 10862 1726773074.97455: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002fe 10862 1726773074.97512: calling self._execute() 10862 1726773074.97719: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10862 1726773074.97781: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10862 1726773074.97800: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10862 1726773074.97817: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10862 1726773074.97828: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10862 1726773074.97997: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10862 1726773074.98024: starting attempt loop 10862 1726773074.98028: running the handler 10862 1726773074.98051: handler run complete 10862 1726773074.98057: attempt loop complete, returning result 10862 1726773074.98061: _execute() done 10862 1726773074.98063: dumping result to json 10862 1726773074.98067: done dumping result, returning 10862 1726773074.98072: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000002fe] 10862 1726773074.98082: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fe 10862 1726773074.98136: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002fe 10862 1726773074.98139: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8119 1726773074.98456: no more 
pending results, returning what we have 8119 1726773074.98460: results queue empty 8119 1726773074.98462: checking for any_errors_fatal 8119 1726773074.98466: done checking for any_errors_fatal 8119 1726773074.98467: checking for max_fail_percentage 8119 1726773074.98469: done checking for max_fail_percentage 8119 1726773074.98471: checking to see if all hosts have failed and the running result is not ok 8119 1726773074.98472: done checking to see if all hosts have failed 8119 1726773074.98473: getting the remaining hosts for this loop 8119 1726773074.98475: done getting the remaining hosts for this loop 8119 1726773074.98480: building list of next tasks for hosts 8119 1726773074.98482: getting the next task for host managed_node2 8119 1726773074.98491: done getting next task for host managed_node2 8119 1726773074.98496: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773074.98502: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773074.98505: done building task lists 8119 1726773074.98507: counting tasks in each state of execution 8119 1726773074.98511: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773074.98514: advancing hosts in ITERATING_TASKS 8119 1726773074.98516: starting to advance hosts 8119 1726773074.98518: getting the next task for host managed_node2 8119 1726773074.98522: done getting next task for host managed_node2 8119 1726773074.98525: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773074.98529: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773074.98531: done advancing hosts to next task 8119 1726773074.98546: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773074.98550: getting variables 8119 1726773074.98553: in VariableManager get_vars() 8119 1726773074.98591: Calling all_inventory to load vars for managed_node2 8119 1726773074.98598: Calling groups_inventory to load vars for managed_node2 8119 1726773074.98602: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773074.98631: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.98646: Calling all_plugins_play to load vars for managed_node2 8119 1726773074.98664: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.98678: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773074.98699: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.98711: Calling groups_plugins_play to load vars for managed_node2 8119 1726773074.98728: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.98759: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.98784: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773074.99125: done with get_vars() 8119 1726773074.99139: done getting variables 8119 1726773074.99146: sending task start callback, copying the task so we can template it temporarily 8119 1726773074.99148: done copying, going to template now 8119 1726773074.99152: done templating 8119 1726773074.99154: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.023) 0:01:09.548 **** 8119 1726773074.99177: sending task start callback 8119 1726773074.99180: entering _queue_task() for managed_node2/set_fact 8119 1726773074.99318: worker is 1 (out of 1 available) 8119 1726773074.99353: exiting _queue_task() for managed_node2/set_fact 8119 1726773074.99426: done queuing things up, now waiting for results queue to drain 8119 1726773074.99431: waiting for pending results... 
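For reference, the two role-level set_fact tasks around this point (roles/kernel_settings/tasks/main.yml:177 above and main.yml:181 just queued) plausibly reduce to the minimal sketch below. Only the task names, the set_fact action, and the fact values observed in this run (kernel_settings_reboot_required: false, __kernel_settings_changed: true in the next result) are confirmed by the log; the expressions and the __kernel_settings_reboot_needed / __kernel_settings_profile_applied variable names are hypothetical, introduced purely for illustration.

# Hedged sketch, not the verbatim role source.
- name: Set the flag that reboot is needed to apply changes
  set_fact:
    # __kernel_settings_reboot_needed is a hypothetical intermediate variable;
    # this run evaluated the flag to false.
    kernel_settings_reboot_required: "{{ __kernel_settings_reboot_needed | d(false) }}"

- name: Set flag to indicate changed for testing
  set_fact:
    # __kernel_settings_profile_applied is likewise hypothetical; the log only
    # shows the resulting flag coming out true on this run.
    __kernel_settings_changed: "{{ __kernel_settings_changed | d(false) or __kernel_settings_profile_applied | d(false) }}"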
10865 1726773074.99657: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 10865 1726773074.99720: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000002ff 10865 1726773074.99763: calling self._execute() 10865 1726773075.02164: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10865 1726773075.02248: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10865 1726773075.02317: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10865 1726773075.02344: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10865 1726773075.02370: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10865 1726773075.02402: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10865 1726773075.02448: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10865 1726773075.02470: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10865 1726773075.02489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10865 1726773075.02571: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10865 1726773075.02589: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10865 1726773075.02604: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10865 1726773075.02844: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10865 1726773075.02850: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10865 1726773075.02854: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10865 1726773075.02856: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10865 1726773075.02858: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10865 1726773075.02860: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10865 1726773075.02862: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10865 1726773075.02863: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10865 1726773075.02865: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10865 1726773075.02886: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 10865 1726773075.02890: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10865 1726773075.02892: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10865 1726773075.02933: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10865 1726773075.02966: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10865 1726773075.02978: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10865 1726773075.02994: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10865 1726773075.03001: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10865 1726773075.03097: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10865 1726773075.03106: starting attempt loop 10865 1726773075.03110: running the handler 10865 1726773075.03121: handler run complete 10865 1726773075.03125: attempt loop complete, returning result 10865 1726773075.03126: _execute() done 10865 1726773075.03128: dumping result to json 10865 1726773075.03130: done dumping result, returning 10865 1726773075.03134: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [12a3200b-1e9d-1dbd-cc52-0000000002ff] 10865 1726773075.03141: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ff 10865 1726773075.03168: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000002ff 10865 1726773075.03171: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8119 1726773075.03321: no more pending results, returning what we have 8119 1726773075.03327: results queue empty 8119 1726773075.03329: checking for any_errors_fatal 8119 1726773075.03333: done checking for any_errors_fatal 8119 1726773075.03335: checking for max_fail_percentage 8119 1726773075.03338: done checking for max_fail_percentage 8119 1726773075.03340: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.03342: done checking to see if all hosts have failed 8119 1726773075.03344: getting the remaining hosts for this loop 8119 1726773075.03346: done getting the remaining hosts for this loop 8119 1726773075.03354: building list of next tasks for hosts 8119 1726773075.03356: getting the next task for host managed_node2 8119 1726773075.03365: done getting next task for host managed_node2 8119 1726773075.03368: ^ task is: TASK: Force handlers 8119 1726773075.03371: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.03374: done building task lists 8119 1726773075.03375: counting tasks in each state of execution 8119 1726773075.03379: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.03381: advancing hosts in ITERATING_TASKS 8119 1726773075.03386: starting to advance hosts 8119 1726773075.03388: getting the next task for host managed_node2 8119 1726773075.03393: done getting next task for host managed_node2 8119 1726773075.03395: ^ task is: TASK: Force handlers 8119 1726773075.03397: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.03399: done advancing hosts to next task META: ran handlers 8119 1726773075.03428: done queuing things up, now waiting for results queue to drain 8119 1726773075.03431: results queue empty 8119 1726773075.03433: checking for any_errors_fatal 8119 1726773075.03436: done checking for any_errors_fatal 8119 1726773075.03438: checking for max_fail_percentage 8119 1726773075.03440: done checking for max_fail_percentage 8119 1726773075.03442: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.03443: done checking to see if all hosts have failed 8119 1726773075.03445: getting the remaining hosts for this loop 8119 1726773075.03448: done getting the remaining hosts for this loop 8119 1726773075.03453: building list of next tasks for hosts 8119 1726773075.03456: getting the next task for host managed_node2 8119 1726773075.03459: done getting next task for host managed_node2 8119 1726773075.03462: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8119 1726773075.03464: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.03467: done building task lists 8119 1726773075.03468: counting tasks in each state of execution 8119 1726773075.03470: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.03472: advancing hosts in ITERATING_TASKS 8119 1726773075.03473: starting to advance hosts 8119 1726773075.03474: getting the next task for host managed_node2 8119 1726773075.03476: done getting next task for host managed_node2 8119 1726773075.03478: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8119 1726773075.03479: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.03481: done advancing hosts to next task 8119 1726773075.03490: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.03493: getting variables 8119 1726773075.03495: in VariableManager get_vars() 8119 1726773075.03522: Calling all_inventory to load vars for managed_node2 8119 1726773075.03526: Calling groups_inventory to load vars for managed_node2 8119 1726773075.03528: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.03552: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.03569: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.03587: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.03603: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.03622: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.03633: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.03654: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.03674: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.03694: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.03961: done with get_vars() 8119 1726773075.03975: done getting variables 8119 1726773075.03981: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.03986: done copying, going to template now 8119 1726773075.03989: done templating 8119 1726773075.03991: here goes the callback... TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:133 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.048) 0:01:09.596 **** 8119 1726773075.04016: sending task start callback 8119 1726773075.04019: entering _queue_task() for managed_node2/assert 8119 1726773075.04161: worker is 1 (out of 1 available) 8119 1726773075.04194: exiting _queue_task() for managed_node2/assert 8119 1726773075.04270: done queuing things up, now waiting for results queue to drain 8119 1726773075.04275: waiting for pending results... 
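The assert task just queued (tests_change_settings.yml:133) checks the fact set by the role above. A minimal sketch, assuming the condition is written directly against kernel_settings_reboot_required; the log confirms only the task name, the assert action, and the final "All assertions passed" message, not the exact condition text.

# Hedged sketch of the test assertion.
- name: Ensure kernel_settings_reboot_required is not set or is false
  assert:
    that:
      - kernel_settings_reboot_required is not defined or not kernel_settings_reboot_required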
10870 1726773075.04375: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 10870 1726773075.04427: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000020 10870 1726773075.04478: calling self._execute() 10870 1726773075.04734: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10870 1726773075.04789: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10870 1726773075.04804: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10870 1726773075.04820: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10870 1726773075.04829: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10870 1726773075.04985: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10870 1726773075.05011: starting attempt loop 10870 1726773075.05015: running the handler 10870 1726773075.06610: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10870 1726773075.06690: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10870 1726773075.06754: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10870 1726773075.06786: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10870 1726773075.06814: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10870 1726773075.06845: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10870 1726773075.06886: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10870 1726773075.06912: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10870 1726773075.06930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10870 1726773075.07007: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10870 1726773075.07025: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10870 1726773075.07039: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10870 1726773075.07360: handler run complete 10870 1726773075.07366: attempt loop complete, returning result 10870 1726773075.07370: _execute() done 10870 1726773075.07372: dumping result to json 10870 1726773075.07375: done dumping result, returning 10870 1726773075.07380: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [12a3200b-1e9d-1dbd-cc52-000000000020] 10870 1726773075.07391: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000020 10870 1726773075.07418: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000020 10870 
1726773075.07422: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8119 1726773075.07884: no more pending results, returning what we have 8119 1726773075.07890: results queue empty 8119 1726773075.07892: checking for any_errors_fatal 8119 1726773075.07895: done checking for any_errors_fatal 8119 1726773075.07897: checking for max_fail_percentage 8119 1726773075.07901: done checking for max_fail_percentage 8119 1726773075.07903: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.07905: done checking to see if all hosts have failed 8119 1726773075.07906: getting the remaining hosts for this loop 8119 1726773075.07909: done getting the remaining hosts for this loop 8119 1726773075.07916: building list of next tasks for hosts 8119 1726773075.07919: getting the next task for host managed_node2 8119 1726773075.07926: done getting next task for host managed_node2 8119 1726773075.07928: ^ task is: TASK: Ensure role reported changed 8119 1726773075.07932: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.07934: done building task lists 8119 1726773075.07935: counting tasks in each state of execution 8119 1726773075.07939: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.07942: advancing hosts in ITERATING_TASKS 8119 1726773075.07944: starting to advance hosts 8119 1726773075.07946: getting the next task for host managed_node2 8119 1726773075.07949: done getting next task for host managed_node2 8119 1726773075.07951: ^ task is: TASK: Ensure role reported changed 8119 1726773075.07953: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.07955: done advancing hosts to next task 8119 1726773075.07970: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.07974: getting variables 8119 1726773075.07977: in VariableManager get_vars() 8119 1726773075.08016: Calling all_inventory to load vars for managed_node2 8119 1726773075.08024: Calling groups_inventory to load vars for managed_node2 8119 1726773075.08029: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.08059: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.08074: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.08093: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.08110: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.08128: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.08139: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.08156: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.08187: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.08209: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.08488: done with get_vars() 8119 1726773075.08498: done getting variables 8119 1726773075.08503: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.08505: done copying, going to template now 8119 1726773075.08508: done templating 8119 1726773075.08510: here goes the callback... TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:137 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.045) 0:01:09.641 **** 8119 1726773075.08526: sending task start callback 8119 1726773075.08528: entering _queue_task() for managed_node2/assert 8119 1726773075.08641: worker is 1 (out of 1 available) 8119 1726773075.08681: exiting _queue_task() for managed_node2/assert 8119 1726773075.08754: done queuing things up, now waiting for results queue to drain 8119 1726773075.08759: waiting for pending results... 
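The next assertion (tests_change_settings.yml:137) verifies that the role reported a change. Given that the role set __kernel_settings_changed to true earlier in this play, a plausible form is sketched below; only the task name, the assert action, and the passing result are confirmed by the log, the condition itself is assumed.

# Hedged sketch; the condition is an assumption based on the flag seen above.
- name: Ensure role reported changed
  assert:
    that:
      - __kernel_settings_changed | d(false)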
10874 1726773075.08968: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 10874 1726773075.09022: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000021 10874 1726773075.09081: calling self._execute() 10874 1726773075.09255: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10874 1726773075.09297: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10874 1726773075.09309: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10874 1726773075.09320: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10874 1726773075.09333: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10874 1726773075.09454: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10874 1726773075.09474: starting attempt loop 10874 1726773075.09476: running the handler 10874 1726773075.11032: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10874 1726773075.11150: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10874 1726773075.11209: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10874 1726773075.11237: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10874 1726773075.11261: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10874 1726773075.11292: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10874 1726773075.11337: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10874 1726773075.11360: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10874 1726773075.11376: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10874 1726773075.11456: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10874 1726773075.11472: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10874 1726773075.11488: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10874 1726773075.11808: handler run complete 10874 1726773075.11814: attempt loop complete, returning result 10874 1726773075.11817: _execute() done 10874 1726773075.11820: dumping result to json 10874 1726773075.11822: done dumping result, returning 10874 1726773075.11827: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [12a3200b-1e9d-1dbd-cc52-000000000021] 10874 1726773075.11841: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000021 10874 1726773075.11867: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000021 10874 1726773075.11870: WORKER PROCESS EXITING ok: [managed_node2] => { 
"changed": false } MSG: All assertions passed 8119 1726773075.12022: no more pending results, returning what we have 8119 1726773075.12026: results queue empty 8119 1726773075.12029: checking for any_errors_fatal 8119 1726773075.12033: done checking for any_errors_fatal 8119 1726773075.12036: checking for max_fail_percentage 8119 1726773075.12039: done checking for max_fail_percentage 8119 1726773075.12041: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.12043: done checking to see if all hosts have failed 8119 1726773075.12045: getting the remaining hosts for this loop 8119 1726773075.12048: done getting the remaining hosts for this loop 8119 1726773075.12055: building list of next tasks for hosts 8119 1726773075.12058: getting the next task for host managed_node2 8119 1726773075.12064: done getting next task for host managed_node2 8119 1726773075.12067: ^ task is: TASK: Check sysctl after reboot 8119 1726773075.12070: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.12072: done building task lists 8119 1726773075.12074: counting tasks in each state of execution 8119 1726773075.12078: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.12080: advancing hosts in ITERATING_TASKS 8119 1726773075.12084: starting to advance hosts 8119 1726773075.12087: getting the next task for host managed_node2 8119 1726773075.12090: done getting next task for host managed_node2 8119 1726773075.12093: ^ task is: TASK: Check sysctl after reboot 8119 1726773075.12095: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.12098: done advancing hosts to next task 8119 1726773075.12115: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.12119: getting variables 8119 1726773075.12122: in VariableManager get_vars() 8119 1726773075.12181: Calling all_inventory to load vars for managed_node2 8119 1726773075.12191: Calling groups_inventory to load vars for managed_node2 8119 1726773075.12195: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.12226: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.12240: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.12256: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.12269: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.12295: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.12306: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.12327: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.12356: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.12378: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.12725: done with get_vars() 8119 1726773075.12736: done getting variables 8119 1726773075.12740: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.12742: done copying, going to template now 8119 1726773075.12744: done templating 8119 1726773075.12745: here goes the callback... TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:141 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.042) 0:01:09.684 **** 8119 1726773075.12763: sending task start callback 8119 1726773075.12765: entering _queue_task() for managed_node2/shell 8119 1726773075.12889: worker is 1 (out of 1 available) 8119 1726773075.12928: exiting _queue_task() for managed_node2/shell 8119 1726773075.13000: done queuing things up, now waiting for results queue to drain 8119 1726773075.13006: waiting for pending results... 
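The "Check sysctl after reboot" task (tests_change_settings.yml:141) is a shell task; its exact command string is visible verbatim in the module invocation below, and the final task result reports "changed": false even though the raw command module returns changed: true, which suggests the test sets changed_when: false. A sketch under that assumption:

# The command matches the _raw_params shown in the module args below;
# changed_when: false is inferred from the final "changed": false result.
- name: Check sysctl after reboot
  shell: |
    set -euo pipefail
    sysctl -n fs.file-max | grep -Lxvq 400000
  changed_when: false

Judging by the task name and the rc=0 result here, the pipeline is meant to verify that fs.file-max still reads 400000 after the reboot; with set -euo pipefail, any non-zero exit in the pipeline fails the task.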
10876 1726773075.13069: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10876 1726773075.13122: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000022 10876 1726773075.13176: calling self._execute() 10876 1726773075.13320: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10876 1726773075.13359: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10876 1726773075.13370: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10876 1726773075.13380: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10876 1726773075.13389: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10876 1726773075.13549: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10876 1726773075.13575: starting attempt loop 10876 1726773075.13579: running the handler 10876 1726773075.13590: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10876 1726773075.13607: _low_level_execute_command(): starting 10876 1726773075.13614: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10876 1726773075.16284: stdout chunk (state=2): >>>/root <<< 10876 1726773075.16399: stderr chunk (state=3): >>><<< 10876 1726773075.16404: stdout chunk (state=3): >>><<< 10876 1726773075.16438: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10876 1726773075.16458: _low_level_execute_command(): starting 10876 1726773075.16466: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222 `" && echo ansible-tmp-1726773075.164504-10876-9352198389222="` echo /root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222 `" ) && sleep 0' 10876 1726773075.19506: stdout chunk (state=2): >>>ansible-tmp-1726773075.164504-10876-9352198389222=/root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222 <<< 10876 1726773075.19626: stderr chunk (state=3): >>><<< 10876 1726773075.19633: stdout chunk (state=3): >>><<< 10876 1726773075.19655: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773075.164504-10876-9352198389222=/root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222 , stderr= 10876 1726773075.19775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 10876 1726773075.19834: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222/AnsiballZ_command.py 10876 1726773075.20136: Sending initial data 10876 1726773075.20150: Sent initial data (152 bytes) 10876 1726773075.22565: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp6qtsvm6r 
/root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222/AnsiballZ_command.py <<< 10876 1726773075.23576: stderr chunk (state=3): >>><<< 10876 1726773075.23581: stdout chunk (state=3): >>><<< 10876 1726773075.23605: done transferring module to remote 10876 1726773075.23622: _low_level_execute_command(): starting 10876 1726773075.23627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222/ /root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222/AnsiballZ_command.py && sleep 0' 10876 1726773075.26146: stderr chunk (state=2): >>><<< 10876 1726773075.26157: stdout chunk (state=2): >>><<< 10876 1726773075.26179: _low_level_execute_command() done: rc=0, stdout=, stderr= 10876 1726773075.26185: _low_level_execute_command(): starting 10876 1726773075.26193: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222/AnsiballZ_command.py && sleep 0' 10876 1726773075.41173: stdout chunk (state=2): >>> {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:15.404695", "end": "2024-09-19 15:11:15.410084", "delta": "0:00:00.005389", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10876 1726773075.42195: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10876 1726773075.42248: stderr chunk (state=3): >>><<< 10876 1726773075.42254: stdout chunk (state=3): >>><<< 10876 1726773075.42275: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:15.404695", "end": "2024-09-19 15:11:15.410084", "delta": "0:00:00.005389", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
10876 1726773075.42307: done with _execute_module (command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10876 1726773075.42321: _low_level_execute_command(): starting 10876 1726773075.42327: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773075.164504-10876-9352198389222/ > /dev/null 2>&1 && sleep 0' 10876 1726773075.45195: stderr chunk (state=2): >>><<< 10876 1726773075.45209: stdout chunk (state=2): >>><<< 10876 1726773075.45235: _low_level_execute_command() done: rc=0, stdout=, stderr= 10876 1726773075.45246: handler run complete 10876 1726773075.45257: attempt loop complete, returning result 10876 1726773075.45273: _execute() done 10876 1726773075.45275: dumping result to json 10876 1726773075.45280: done dumping result, returning 10876 1726773075.45294: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [12a3200b-1e9d-1dbd-cc52-000000000022] 10876 1726773075.45309: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000022 10876 1726773075.45399: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000022 10876 1726773075.45405: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "delta": "0:00:00.005389", "end": "2024-09-19 15:11:15.410084", "rc": 0, "start": "2024-09-19 15:11:15.404695" } 8119 1726773075.45805: no more pending results, returning what we have 8119 1726773075.45809: results queue empty 8119 1726773075.45814: checking for any_errors_fatal 8119 1726773075.45817: done checking for any_errors_fatal 8119 1726773075.45819: checking for max_fail_percentage 8119 1726773075.45821: done checking for max_fail_percentage 8119 1726773075.45823: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.45827: done checking to see if all hosts have failed 8119 1726773075.45829: getting the remaining hosts for this loop 8119 1726773075.45831: done getting the remaining hosts for this loop 8119 1726773075.45838: building list of next tasks for hosts 8119 1726773075.45840: getting the next task for host managed_node2 8119 1726773075.45845: done getting next task for host managed_node2 8119 1726773075.45847: ^ task is: TASK: Apply kernel_settings for removing 8119 1726773075.45851: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.45853: done building task lists 8119 1726773075.45855: counting tasks in each state of execution 8119 1726773075.45858: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.45860: advancing hosts in ITERATING_TASKS 8119 1726773075.45862: starting to advance hosts 8119 1726773075.45864: getting the next task for host managed_node2 8119 1726773075.45867: done getting next task for host managed_node2 8119 1726773075.45869: ^ task is: TASK: Apply kernel_settings for removing 8119 1726773075.45871: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.45873: done advancing hosts to next task 8119 1726773075.45885: getting variables 8119 1726773075.45888: in VariableManager get_vars() 8119 1726773075.45916: Calling all_inventory to load vars for managed_node2 8119 1726773075.45923: Calling groups_inventory to load vars for managed_node2 8119 1726773075.45927: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.45976: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.45994: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.46016: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.46032: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.46051: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.46063: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.46078: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.46114: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.46140: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.46459: done with get_vars() 8119 1726773075.46473: done getting variables 8119 1726773075.46479: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.46481: done copying, going to template now 8119 1726773075.46487: done templating 8119 1726773075.46489: here goes the callback... 
TASK [Apply kernel_settings for removing] ************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:147 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.337) 0:01:10.021 **** 8119 1726773075.46515: sending task start callback 8119 1726773075.46518: entering _queue_task() for managed_node2/include_role 8119 1726773075.46689: worker is 1 (out of 1 available) 8119 1726773075.46731: exiting _queue_task() for managed_node2/include_role 8119 1726773075.46816: done queuing things up, now waiting for results queue to drain 8119 1726773075.46822: waiting for pending results... 10894 1726773075.47090: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing 10894 1726773075.47152: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000023 10894 1726773075.47214: calling self._execute() 10894 1726773075.47350: _execute() done 10894 1726773075.47356: dumping result to json 10894 1726773075.47360: done dumping result, returning 10894 1726773075.47365: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing [12a3200b-1e9d-1dbd-cc52-000000000023] 10894 1726773075.47378: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000023 10894 1726773075.47697: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000023 10894 1726773075.47702: WORKER PROCESS EXITING 8119 1726773075.48195: no more pending results, returning what we have 8119 1726773075.48204: in VariableManager get_vars() 8119 1726773075.48245: Calling all_inventory to load vars for managed_node2 8119 1726773075.48251: Calling groups_inventory to load vars for managed_node2 8119 1726773075.48255: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.48288: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.48304: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.48324: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.48339: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.48358: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.48369: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.48389: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.48421: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.48445: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.48796: done with get_vars() 8119 1726773075.50942: we have included files to process 8119 1726773075.50949: generating all_blocks data 8119 1726773075.50953: done generating all_blocks data 8119 1726773075.50958: processing included file: fedora.linux_system_roles.kernel_settings 8119 1726773075.50982: in VariableManager get_vars() 8119 1726773075.51017: done with get_vars() 8119 
1726773075.51093: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8119 1726773075.51185: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8119 1726773075.51221: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8119 1726773075.51313: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8119 1726773075.51867: in VariableManager get_vars() 8119 1726773075.51904: done with get_vars() 8119 1726773075.52138: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773075.52201: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773075.52355: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773075.52413: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773075.52607: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773075.52822: in VariableManager get_vars() 8119 1726773075.52855: done with get_vars() 8119 1726773075.52958: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8119 1726773075.53353: iterating over new_blocks loaded from include file 8119 1726773075.53359: in VariableManager get_vars() 8119 1726773075.53380: done with get_vars() 8119 1726773075.53385: filtering new block on tags 8119 1726773075.53445: done filtering new block on tags 8119 1726773075.53458: in VariableManager get_vars() 8119 1726773075.53478: done with get_vars() 8119 1726773075.53481: filtering new block on tags 8119 1726773075.53522: done filtering new block on tags 8119 1726773075.53530: in VariableManager get_vars() 8119 1726773075.53544: done with get_vars() 8119 1726773075.53546: filtering new block on tags 8119 1726773075.53667: done filtering new block on tags 8119 1726773075.53676: done iterating over new_blocks loaded from include file 8119 1726773075.53679: extending task lists for all hosts with included blocks 8119 1726773075.56001: done extending task lists 8119 1726773075.56008: done processing included files 8119 1726773075.56010: results queue empty 8119 1726773075.56012: checking for any_errors_fatal 8119 1726773075.56018: done checking for any_errors_fatal 8119 1726773075.56020: checking for max_fail_percentage 8119 1726773075.56023: done checking for max_fail_percentage 8119 1726773075.56025: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.56027: done checking to see if all hosts have failed 8119 1726773075.56029: getting the remaining hosts for this loop 8119 1726773075.56032: done getting the remaining hosts for this loop 8119 1726773075.56039: building list of next tasks for hosts 8119 1726773075.56041: getting the next task for host managed_node2 8119 1726773075.56047: done getting next task for host managed_node2 8119 1726773075.56051: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean 
values 8119 1726773075.56056: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.56058: done building task lists 8119 1726773075.56060: counting tasks in each state of execution 8119 1726773075.56063: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.56066: advancing hosts in ITERATING_TASKS 8119 1726773075.56068: starting to advance hosts 8119 1726773075.56070: getting the next task for host managed_node2 8119 1726773075.56075: done getting next task for host managed_node2 8119 1726773075.56078: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8119 1726773075.56081: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.56086: done advancing hosts to next task 8119 1726773075.56097: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.56102: getting variables 8119 1726773075.56104: in VariableManager get_vars() 8119 1726773075.56130: Calling all_inventory to load vars for managed_node2 8119 1726773075.56137: Calling groups_inventory to load vars for managed_node2 8119 1726773075.56141: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.56162: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.56174: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.56190: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.56203: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.56218: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.56226: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.56240: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.56264: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 
1726773075.56282: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.56566: done with get_vars() 8119 1726773075.56579: done getting variables 8119 1726773075.56586: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.56588: done copying, going to template now 8119 1726773075.56591: done templating 8119 1726773075.56593: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.101) 0:01:10.122 **** 8119 1726773075.56615: sending task start callback 8119 1726773075.56618: entering _queue_task() for managed_node2/fail 8119 1726773075.56792: worker is 1 (out of 1 available) 8119 1726773075.56826: exiting _queue_task() for managed_node2/fail 8119 1726773075.56893: done queuing things up, now waiting for results queue to drain 8119 1726773075.56898: waiting for pending results... 10900 1726773075.57110: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10900 1726773075.57181: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005e2 10900 1726773075.57238: calling self._execute() 10900 1726773075.59699: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10900 1726773075.59812: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10900 1726773075.59895: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10900 1726773075.59937: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10900 1726773075.59976: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10900 1726773075.60017: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10900 1726773075.60078: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10900 1726773075.60111: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10900 1726773075.60137: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10900 1726773075.60258: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10900 1726773075.60285: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10900 1726773075.60309: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10900 1726773075.61194: when evaluation is False, skipping this task 10900 1726773075.61200: _execute() done 10900 1726773075.61202: dumping result to json 10900 1726773075.61205: done dumping result, returning 10900 1726773075.61211: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [12a3200b-1e9d-1dbd-cc52-0000000005e2] 10900 1726773075.61222: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e2 
10900 1726773075.61264: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e2 10900 1726773075.61268: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773075.61649: no more pending results, returning what we have 8119 1726773075.61655: results queue empty 8119 1726773075.61659: checking for any_errors_fatal 8119 1726773075.61667: done checking for any_errors_fatal 8119 1726773075.61669: checking for max_fail_percentage 8119 1726773075.61673: done checking for max_fail_percentage 8119 1726773075.61675: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.61677: done checking to see if all hosts have failed 8119 1726773075.61680: getting the remaining hosts for this loop 8119 1726773075.61684: done getting the remaining hosts for this loop 8119 1726773075.61694: building list of next tasks for hosts 8119 1726773075.61697: getting the next task for host managed_node2 8119 1726773075.61706: done getting next task for host managed_node2 8119 1726773075.61712: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773075.61717: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.61720: done building task lists 8119 1726773075.61722: counting tasks in each state of execution 8119 1726773075.61726: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.61729: advancing hosts in ITERATING_TASKS 8119 1726773075.61731: starting to advance hosts 8119 1726773075.61734: getting the next task for host managed_node2 8119 1726773075.61739: done getting next task for host managed_node2 8119 1726773075.61742: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773075.61745: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.61748: done advancing hosts to next task 8119 1726773075.61765: getting variables 8119 1726773075.61769: in VariableManager get_vars() 8119 1726773075.61813: Calling all_inventory to load vars for managed_node2 8119 1726773075.61821: Calling groups_inventory to load vars for managed_node2 8119 1726773075.61825: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.61856: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.61873: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.61893: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.61909: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.61930: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.61941: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.61957: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.61991: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.62016: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.62393: done with get_vars() 8119 1726773075.62406: done getting variables 8119 1726773075.62413: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.62416: done copying, going to template now 8119 1726773075.62419: done templating 8119 1726773075.62421: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.058) 0:01:10.180 **** 8119 1726773075.62442: sending task start callback 8119 1726773075.62445: entering _queue_task() for managed_node2/include_tasks 8119 1726773075.62641: worker is 1 (out of 1 available) 8119 1726773075.62676: exiting _queue_task() for managed_node2/include_tasks 8119 1726773075.62748: done queuing things up, now waiting for results queue to drain 8119 1726773075.62753: waiting for pending results... 
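[Editor's note] The "Check sysctl settings for boolean values" task traced above was queued to the fail action and then skipped because its when clause rendered False ("Conditional result was False"). As a minimal sketch only, and not the role's actual tasks/main.yml, a guarded fail task of roughly this shape would produce exactly that skip; the message wording and the variable driving the condition are assumptions:

  # Sketch, not the role source. The "when" guard is what makes the worker
  # log "when evaluation is False, skipping this task" as seen above.
  - name: Check sysctl settings for boolean values
    fail:
      msg: sysctl settings must not use yes/no or true/false values   # assumed wording
    when: __kernel_settings_bad_bool_settings | d([]) | length > 0    # assumed variable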
10908 1726773075.62971: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10908 1726773075.63045: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005e3 10908 1726773075.63099: calling self._execute() 10908 1726773075.63233: _execute() done 10908 1726773075.63239: dumping result to json 10908 1726773075.63242: done dumping result, returning 10908 1726773075.63248: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [12a3200b-1e9d-1dbd-cc52-0000000005e3] 10908 1726773075.63261: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e3 10908 1726773075.63598: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e3 10908 1726773075.63603: WORKER PROCESS EXITING 8119 1726773075.63996: no more pending results, returning what we have 8119 1726773075.64005: in VariableManager get_vars() 8119 1726773075.64049: Calling all_inventory to load vars for managed_node2 8119 1726773075.64055: Calling groups_inventory to load vars for managed_node2 8119 1726773075.64060: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.64093: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.64112: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.64131: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.64146: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.64164: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.64175: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.64193: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.64228: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.64252: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.64602: done with get_vars() 8119 1726773075.64663: we have included files to process 8119 1726773075.64667: generating all_blocks data 8119 1726773075.64671: done generating all_blocks data 8119 1726773075.64676: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773075.64679: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773075.64686: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773075.64921: plugin lookup for setup failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773075.65014: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773075.65137: plugin lookup for stat failed; 
errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8119 1726773075.65306: done processing included file 8119 1726773075.65309: iterating over new_blocks loaded from include file 8119 1726773075.65315: in VariableManager get_vars() 8119 1726773075.65343: done with get_vars() 8119 1726773075.65347: filtering new block on tags 8119 1726773075.65414: done filtering new block on tags 8119 1726773075.65428: in VariableManager get_vars() 8119 1726773075.65453: done with get_vars() 8119 1726773075.65457: filtering new block on tags 8119 1726773075.65520: done filtering new block on tags 8119 1726773075.65531: in VariableManager get_vars() 8119 1726773075.65558: done with get_vars() 8119 1726773075.65563: filtering new block on tags 8119 1726773075.65643: done filtering new block on tags 8119 1726773075.65656: in VariableManager get_vars() 8119 1726773075.65686: done with get_vars() 8119 1726773075.65691: filtering new block on tags 8119 1726773075.65771: done filtering new block on tags 8119 1726773075.65785: done iterating over new_blocks loaded from include file 8119 1726773075.65789: extending task lists for all hosts with included blocks 8119 1726773075.65926: done extending task lists 8119 1726773075.65931: done processing included files 8119 1726773075.65934: results queue empty 8119 1726773075.65936: checking for any_errors_fatal 8119 1726773075.65941: done checking for any_errors_fatal 8119 1726773075.65943: checking for max_fail_percentage 8119 1726773075.65946: done checking for max_fail_percentage 8119 1726773075.65948: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.65950: done checking to see if all hosts have failed 8119 1726773075.65952: getting the remaining hosts for this loop 8119 1726773075.65955: done getting the remaining hosts for this loop 8119 1726773075.65961: building list of next tasks for hosts 8119 1726773075.65964: getting the next task for host managed_node2 8119 1726773075.65971: done getting next task for host managed_node2 8119 1726773075.65975: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773075.65980: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.65984: done building task lists 8119 1726773075.65987: counting tasks in each state of execution 8119 1726773075.65991: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.65993: advancing hosts in ITERATING_TASKS 8119 1726773075.65996: starting to advance hosts 8119 1726773075.65998: getting the next task for host managed_node2 8119 1726773075.66003: done getting next task for host managed_node2 8119 1726773075.66006: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773075.66013: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.66016: done advancing hosts to next task 8119 1726773075.66025: getting variables 8119 1726773075.66028: in VariableManager get_vars() 8119 1726773075.66046: Calling all_inventory to load vars for managed_node2 8119 1726773075.66051: Calling groups_inventory to load vars for managed_node2 8119 1726773075.66055: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.66075: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.66089: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.66108: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.66125: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.66144: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.66155: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.66172: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.66205: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.66234: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.66544: done with get_vars() 8119 1726773075.66556: done getting variables 8119 1726773075.66561: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.66564: done copying, going to template now 8119 1726773075.66566: done templating 8119 1726773075.66568: here goes the callback... 
TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.041) 0:01:10.222 **** 8119 1726773075.66591: sending task start callback 8119 1726773075.66593: entering _queue_task() for managed_node2/setup 8119 1726773075.66752: worker is 1 (out of 1 available) 8119 1726773075.66786: exiting _queue_task() for managed_node2/setup 8119 1726773075.66854: done queuing things up, now waiting for results queue to drain 8119 1726773075.66858: waiting for pending results... 10911 1726773075.67067: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10911 1726773075.67142: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000759 10911 1726773075.67193: calling self._execute() 10911 1726773075.69664: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10911 1726773075.69785: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10911 1726773075.69871: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10911 1726773075.69918: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10911 1726773075.69960: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10911 1726773075.70002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10911 1726773075.70067: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10911 1726773075.70116: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10911 1726773075.70143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10911 1726773075.70258: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10911 1726773075.70280: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10911 1726773075.70300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10911 1726773075.70855: when evaluation is False, skipping this task 10911 1726773075.70862: _execute() done 10911 1726773075.70864: dumping result to json 10911 1726773075.70867: done dumping result, returning 10911 1726773075.70873: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [12a3200b-1e9d-1dbd-cc52-000000000759] 10911 1726773075.70886: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000759 10911 1726773075.70929: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000759 10911 1726773075.70934: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773075.71239: no more pending results, returning what we have 8119 1726773075.71243: results queue empty 8119 1726773075.71245: checking for any_errors_fatal 8119 1726773075.71248: done checking for any_errors_fatal 8119 1726773075.71249: checking for max_fail_percentage 8119 
1726773075.71252: done checking for max_fail_percentage 8119 1726773075.71253: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.71254: done checking to see if all hosts have failed 8119 1726773075.71255: getting the remaining hosts for this loop 8119 1726773075.71257: done getting the remaining hosts for this loop 8119 1726773075.71263: building list of next tasks for hosts 8119 1726773075.71265: getting the next task for host managed_node2 8119 1726773075.71273: done getting next task for host managed_node2 8119 1726773075.71277: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773075.71281: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.71284: done building task lists 8119 1726773075.71287: counting tasks in each state of execution 8119 1726773075.71291: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.71293: advancing hosts in ITERATING_TASKS 8119 1726773075.71295: starting to advance hosts 8119 1726773075.71297: getting the next task for host managed_node2 8119 1726773075.71302: done getting next task for host managed_node2 8119 1726773075.71304: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773075.71307: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.71309: done advancing hosts to next task 8119 1726773075.71321: getting variables 8119 1726773075.71324: in VariableManager get_vars() 8119 1726773075.71353: Calling all_inventory to load vars for managed_node2 8119 1726773075.71356: Calling groups_inventory to load vars for managed_node2 8119 1726773075.71359: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.71379: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.71392: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.71406: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.71421: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.71439: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.71447: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.71456: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.71474: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.71492: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.71736: done with get_vars() 8119 1726773075.71747: done getting variables 8119 1726773075.71752: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.71755: done copying, going to template now 8119 1726773075.71758: done templating 8119 1726773075.71760: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.051) 0:01:10.274 **** 8119 1726773075.71777: sending task start callback 8119 1726773075.71779: entering _queue_task() for managed_node2/stat 8119 1726773075.71901: worker is 1 (out of 1 available) 8119 1726773075.71938: exiting _queue_task() for managed_node2/stat 8119 1726773075.72010: done queuing things up, now waiting for results queue to drain 8119 1726773075.72016: waiting for pending results... 
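[Editor's note] The "Ensure ansible_facts used by role" task traced above (queued from set_vars.yml:2, setup action) was also skipped on its condition. A minimal sketch of that conditional fact-gathering pattern, with the fact filter and condition assumed rather than taken from the role:

  # Sketch only -- gathers a minimal fact subset when the facts the role
  # relies on are not already present, so a repeat run skips the task.
  - name: Ensure ansible_facts used by role
    setup:
      gather_subset: min                           # assumed subset
      filter: "ansible_distribution*"              # assumed filter
    when: "'distribution' not in ansible_facts"    # assumed condition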
10914 1726773075.72079: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10914 1726773075.72139: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000075b 10914 1726773075.72187: calling self._execute() 10914 1726773075.74060: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10914 1726773075.74145: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10914 1726773075.74196: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10914 1726773075.74230: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10914 1726773075.74260: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10914 1726773075.74288: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10914 1726773075.74350: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10914 1726773075.74374: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10914 1726773075.74399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10914 1726773075.74498: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10914 1726773075.74524: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10914 1726773075.74547: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10914 1726773075.74887: when evaluation is False, skipping this task 10914 1726773075.74892: _execute() done 10914 1726773075.74897: dumping result to json 10914 1726773075.74899: done dumping result, returning 10914 1726773075.74905: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [12a3200b-1e9d-1dbd-cc52-00000000075b] 10914 1726773075.74918: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075b 10914 1726773075.74950: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075b 10914 1726773075.74954: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773075.75161: no more pending results, returning what we have 8119 1726773075.75166: results queue empty 8119 1726773075.75169: checking for any_errors_fatal 8119 1726773075.75173: done checking for any_errors_fatal 8119 1726773075.75175: checking for max_fail_percentage 8119 1726773075.75178: done checking for max_fail_percentage 8119 1726773075.75180: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.75182: done checking to see if all hosts have failed 8119 1726773075.75186: getting the remaining hosts for this loop 8119 1726773075.75189: done getting the remaining hosts for this loop 8119 1726773075.75196: building list of next tasks for hosts 8119 1726773075.75199: getting the next task for host managed_node2 8119 1726773075.75207: done getting next task for host managed_node2 8119 1726773075.75212: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 
1726773075.75217: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.75221: done building task lists 8119 1726773075.75223: counting tasks in each state of execution 8119 1726773075.75227: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.75230: advancing hosts in ITERATING_TASKS 8119 1726773075.75232: starting to advance hosts 8119 1726773075.75235: getting the next task for host managed_node2 8119 1726773075.75240: done getting next task for host managed_node2 8119 1726773075.75243: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 1726773075.75247: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.75252: done advancing hosts to next task 8119 1726773075.75264: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.75268: getting variables 8119 1726773075.75271: in VariableManager get_vars() 8119 1726773075.75308: Calling all_inventory to load vars for managed_node2 8119 1726773075.75317: Calling groups_inventory to load vars for managed_node2 8119 1726773075.75321: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.75343: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.75355: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.75365: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.75374: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.75386: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.75394: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.75403: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.75422: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.75443: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.75653: done with get_vars() 8119 1726773075.75665: done getting variables 8119 1726773075.75670: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.75672: done copying, going to template now 8119 1726773075.75674: done templating 8119 1726773075.75675: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.039) 0:01:10.313 **** 8119 1726773075.75698: sending task start callback 8119 1726773075.75700: entering _queue_task() for managed_node2/set_fact 8119 1726773075.75813: worker is 1 (out of 1 available) 8119 1726773075.75849: exiting _queue_task() for managed_node2/set_fact 8119 1726773075.75919: done queuing things up, now waiting for results queue to drain 8119 1726773075.75925: waiting for pending results... 
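[Editor's note] The ostree detection traced here ("Check if system is ostree" via stat, then "Set flag to indicate system is ostree" via set_fact) follows a stat-then-set_fact pattern: probe a marker file once, record the result as a fact, and let the guard skip both tasks on later runs. A sketch under assumptions; only the task names and action plugins match the trace, while the path, register, and fact names are assumed:

  # Sketch only; /run/ostree-booted is the conventional ostree marker file.
  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted                             # assumed path
    register: __ostree_booted_stat                         # assumed register name
    when: not __kernel_settings_is_ostree is defined       # assumed guard

  - name: Set flag to indicate system is ostree
    set_fact:
      __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed fact name
    when: not __kernel_settings_is_ostree is defined       # assumed guard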
10917 1726773075.75991: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 10917 1726773075.76052: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000075c 10917 1726773075.76096: calling self._execute() 10917 1726773075.78056: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10917 1726773075.78141: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10917 1726773075.78205: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10917 1726773075.78244: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10917 1726773075.78279: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10917 1726773075.78335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10917 1726773075.78401: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10917 1726773075.78437: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10917 1726773075.78464: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10917 1726773075.78589: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10917 1726773075.78615: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10917 1726773075.78636: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10917 1726773075.78974: when evaluation is False, skipping this task 10917 1726773075.78981: _execute() done 10917 1726773075.78985: dumping result to json 10917 1726773075.78988: done dumping result, returning 10917 1726773075.78994: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [12a3200b-1e9d-1dbd-cc52-00000000075c] 10917 1726773075.79007: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075c 10917 1726773075.79037: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075c 10917 1726773075.79041: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773075.79353: no more pending results, returning what we have 8119 1726773075.79358: results queue empty 8119 1726773075.79360: checking for any_errors_fatal 8119 1726773075.79364: done checking for any_errors_fatal 8119 1726773075.79366: checking for max_fail_percentage 8119 1726773075.79369: done checking for max_fail_percentage 8119 1726773075.79373: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.79375: done checking to see if all hosts have failed 8119 1726773075.79378: getting the remaining hosts for this loop 8119 1726773075.79380: done getting the remaining hosts for this loop 8119 1726773075.79391: building list of next tasks for hosts 8119 1726773075.79396: getting the next task for host managed_node2 8119 1726773075.79406: done getting next task for host managed_node2 8119 1726773075.79414: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update 
exists in /sbin 8119 1726773075.79419: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.79421: done building task lists 8119 1726773075.79422: counting tasks in each state of execution 8119 1726773075.79425: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.79427: advancing hosts in ITERATING_TASKS 8119 1726773075.79428: starting to advance hosts 8119 1726773075.79430: getting the next task for host managed_node2 8119 1726773075.79434: done getting next task for host managed_node2 8119 1726773075.79436: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8119 1726773075.79439: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.79441: done advancing hosts to next task 8119 1726773075.79461: getting variables 8119 1726773075.79466: in VariableManager get_vars() 8119 1726773075.79504: Calling all_inventory to load vars for managed_node2 8119 1726773075.79509: Calling groups_inventory to load vars for managed_node2 8119 1726773075.79512: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.79534: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.79545: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.79555: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.79563: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.79574: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.79579: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.79605: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.79629: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.79644: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.79857: done with get_vars() 8119 1726773075.79868: done getting variables 8119 1726773075.79873: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.79874: done copying, going to template now 8119 1726773075.79876: done templating 8119 1726773075.79877: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.041) 0:01:10.355 **** 8119 1726773075.79897: sending task start callback 8119 1726773075.79899: entering _queue_task() for managed_node2/stat 8119 1726773075.80024: worker is 1 (out of 1 available) 8119 1726773075.80061: exiting _queue_task() for managed_node2/stat 8119 1726773075.80130: done queuing things up, now waiting for results queue to drain 8119 1726773075.80136: waiting for pending results... 
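[Editor's note] The transactional-update detection that follows uses the same shape: the task name in the trace names the probed path (/sbin/transactional-update), a stat task is queued, and a flag fact is then set from the result. Sketch only; the register and fact names are assumptions:

  # Sketch only -- mirrors the ostree check above with an assumed flag name.
  - name: Check if transactional-update exists in /sbin
    stat:
      path: /sbin/transactional-update
    register: __transactional_update_stat                        # assumed register name
    when: not __kernel_settings_is_transactional is defined      # assumed guard

  - name: Set flag if transactional-update exists
    set_fact:
      __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"   # assumed fact name
    when: not __kernel_settings_is_transactional is defined      # assumed guard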
10921 1726773075.80204: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 10921 1726773075.80266: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000075e 10921 1726773075.80313: calling self._execute() 10921 1726773075.82291: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10921 1726773075.82413: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10921 1726773075.82473: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10921 1726773075.82507: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10921 1726773075.82559: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10921 1726773075.82591: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10921 1726773075.82647: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10921 1726773075.82670: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10921 1726773075.82689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10921 1726773075.82794: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10921 1726773075.82821: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10921 1726773075.82843: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10921 1726773075.83171: when evaluation is False, skipping this task 10921 1726773075.83178: _execute() done 10921 1726773075.83180: dumping result to json 10921 1726773075.83184: done dumping result, returning 10921 1726773075.83191: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [12a3200b-1e9d-1dbd-cc52-00000000075e] 10921 1726773075.83203: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075e 10921 1726773075.83243: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075e 10921 1726773075.83247: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773075.83676: no more pending results, returning what we have 8119 1726773075.83681: results queue empty 8119 1726773075.83685: checking for any_errors_fatal 8119 1726773075.83691: done checking for any_errors_fatal 8119 1726773075.83693: checking for max_fail_percentage 8119 1726773075.83697: done checking for max_fail_percentage 8119 1726773075.83699: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.83701: done checking to see if all hosts have failed 8119 1726773075.83703: getting the remaining hosts for this loop 8119 1726773075.83706: done getting the remaining hosts for this loop 8119 1726773075.83717: building list of next tasks for hosts 8119 1726773075.83720: getting the next task for host managed_node2 8119 1726773075.83729: done getting next task for host managed_node2 8119 1726773075.83735: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if 
transactional-update exists 8119 1726773075.83740: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.83743: done building task lists 8119 1726773075.83746: counting tasks in each state of execution 8119 1726773075.83750: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.83752: advancing hosts in ITERATING_TASKS 8119 1726773075.83755: starting to advance hosts 8119 1726773075.83757: getting the next task for host managed_node2 8119 1726773075.83763: done getting next task for host managed_node2 8119 1726773075.83767: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8119 1726773075.83771: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.83773: done advancing hosts to next task 8119 1726773075.83792: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.83797: getting variables 8119 1726773075.83801: in VariableManager get_vars() 8119 1726773075.83843: Calling all_inventory to load vars for managed_node2 8119 1726773075.83850: Calling groups_inventory to load vars for managed_node2 8119 1726773075.83854: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.83882: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.83899: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.83916: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.83934: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.83947: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.83954: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.83964: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.83981: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.83999: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.84218: done with get_vars() 8119 1726773075.84228: done getting variables 8119 1726773075.84236: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.84239: done copying, going to template now 8119 1726773075.84241: done templating 8119 1726773075.84242: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.043) 0:01:10.399 **** 8119 1726773075.84258: sending task start callback 8119 1726773075.84260: entering _queue_task() for managed_node2/set_fact 8119 1726773075.84384: worker is 1 (out of 1 available) 8119 1726773075.84424: exiting _queue_task() for managed_node2/set_fact 8119 1726773075.84495: done queuing things up, now waiting for results queue to drain 8119 1726773075.84501: waiting for pending results... 
10924 1726773075.84560: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 10924 1726773075.84621: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000075f 10924 1726773075.84668: calling self._execute() 10924 1726773075.86539: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10924 1726773075.86627: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10924 1726773075.86679: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10924 1726773075.86724: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10924 1726773075.86996: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10924 1726773075.87030: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10924 1726773075.87070: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10924 1726773075.87096: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10924 1726773075.87120: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10924 1726773075.87195: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10924 1726773075.87221: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10924 1726773075.87237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10924 1726773075.87498: when evaluation is False, skipping this task 10924 1726773075.87504: _execute() done 10924 1726773075.87506: dumping result to json 10924 1726773075.87509: done dumping result, returning 10924 1726773075.87517: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [12a3200b-1e9d-1dbd-cc52-00000000075f] 10924 1726773075.87528: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075f 10924 1726773075.87553: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000075f 10924 1726773075.87556: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773075.87956: no more pending results, returning what we have 8119 1726773075.87961: results queue empty 8119 1726773075.87963: checking for any_errors_fatal 8119 1726773075.87970: done checking for any_errors_fatal 8119 1726773075.87972: checking for max_fail_percentage 8119 1726773075.87975: done checking for max_fail_percentage 8119 1726773075.87978: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.87980: done checking to see if all hosts have failed 8119 1726773075.87982: getting the remaining hosts for this loop 8119 1726773075.87986: done getting the remaining hosts for this loop 8119 1726773075.87994: building list of next tasks for hosts 8119 1726773075.87997: getting the next task for host managed_node2 8119 1726773075.88007: done getting next task for host managed_node2 8119 1726773075.88015: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version 
specific variables 8119 1726773075.88021: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.88025: done building task lists 8119 1726773075.88027: counting tasks in each state of execution 8119 1726773075.88031: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.88033: advancing hosts in ITERATING_TASKS 8119 1726773075.88036: starting to advance hosts 8119 1726773075.88038: getting the next task for host managed_node2 8119 1726773075.88045: done getting next task for host managed_node2 8119 1726773075.88048: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8119 1726773075.88052: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.88055: done advancing hosts to next task 8119 1726773075.88071: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.88078: getting variables 8119 1726773075.88082: in VariableManager get_vars() 8119 1726773075.88128: Calling all_inventory to load vars for managed_node2 8119 1726773075.88134: Calling groups_inventory to load vars for managed_node2 8119 1726773075.88138: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.88168: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.88187: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.88211: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.88224: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.88236: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.88243: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.88256: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.88291: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.88309: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.88533: done with get_vars() 8119 1726773075.88544: done getting variables 8119 1726773075.88549: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.88551: done copying, going to template now 8119 1726773075.88553: done templating 8119 1726773075.88555: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.043) 0:01:10.442 **** 8119 1726773075.88571: sending task start callback 8119 1726773075.88573: entering _queue_task() for managed_node2/include_vars 8119 1726773075.88712: worker is 1 (out of 1 available) 8119 1726773075.88751: exiting _queue_task() for managed_node2/include_vars 8119 1726773075.88826: done queuing things up, now waiting for results queue to drain 8119 1726773075.88832: waiting for pending results... 
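
Editor's note: the "Set flag if transactional-update exists" task above was short-circuited by its `when` conditional ("when evaluation is False, skipping this task"), so no connection or action plugin was loaded for it and the host received the skipped result shown. A rough, hand-written illustration of that control flow - not Ansible's actual TaskExecutor code:

# Rough illustration only; mirrors the behaviour visible in the log:
# when the task's `when` conditional evaluates to False, the handler
# never runs and the host gets a "skipped" result instead.

def run_task(conditional_result: bool, action):
    """Return a result dict shaped like the one printed for managed_node2."""
    if not conditional_result:
        return {"changed": False, "skip_reason": "Conditional result was False"}
    return action()

# The set_fact task above skipped because its conditional came back False.
result = run_task(False, lambda: {"changed": False, "ansible_facts": {}})
print(result)  # {'changed': False, 'skip_reason': 'Conditional result was False'}
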
10927 1726773075.88895: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 10927 1726773075.88951: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000761 10927 1726773075.89002: calling self._execute() 10927 1726773075.90947: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10927 1726773075.91024: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10927 1726773075.91086: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10927 1726773075.91116: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10927 1726773075.91143: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10927 1726773075.91174: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10927 1726773075.91220: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10927 1726773075.91243: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10927 1726773075.91258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10927 1726773075.91340: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10927 1726773075.91357: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10927 1726773075.91370: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10927 1726773075.92101: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup 10927 1726773075.92236: Loaded config def from plugin (lookup/first_found) 10927 1726773075.92241: Loading LookupModule 'first_found' from /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup/first_found.py 10927 1726773075.92291: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10927 1726773075.92331: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10927 1726773075.92341: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10927 1726773075.92351: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10927 1726773075.92356: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10927 1726773075.92446: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10927 1726773075.92454: starting attempt loop 10927 1726773075.92456: running the handler 10927 1726773075.92497: handler run complete 10927 1726773075.92503: attempt loop complete, returning result 10927 1726773075.92505: _execute() done 10927 1726773075.92506: dumping result to json 10927 1726773075.92509: done dumping result, returning 10927 1726773075.92513: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [12a3200b-1e9d-1dbd-cc52-000000000761] 10927 1726773075.92520: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000761 10927 1726773075.92549: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000761 10927 1726773075.92553: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8119 1726773075.92759: no more pending results, returning what we have 8119 1726773075.92764: results queue empty 8119 1726773075.92766: checking for any_errors_fatal 8119 1726773075.92770: done checking for any_errors_fatal 8119 1726773075.92772: checking for max_fail_percentage 8119 1726773075.92775: done checking for max_fail_percentage 8119 1726773075.92777: checking to see if all hosts have failed and the running result is not ok 8119 1726773075.92779: done checking to see if all hosts have failed 8119 1726773075.92781: getting the remaining hosts for this loop 8119 1726773075.92786: done getting the remaining hosts for this loop 8119 1726773075.92793: building list of next tasks for hosts 8119 1726773075.92796: getting the next task for host managed_node2 8119 1726773075.92805: done getting next task for host managed_node2 8119 1726773075.92809: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773075.92815: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773075.92818: done building task lists 8119 1726773075.92819: counting tasks in each state of execution 8119 1726773075.92823: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773075.92825: advancing hosts in ITERATING_TASKS 8119 1726773075.92828: starting to advance hosts 8119 1726773075.92830: getting the next task for host managed_node2 8119 1726773075.92834: done getting next task for host managed_node2 8119 1726773075.92837: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773075.92840: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773075.92842: done advancing hosts to next task 8119 1726773075.92853: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773075.92857: getting variables 8119 1726773075.92859: in VariableManager get_vars() 8119 1726773075.92903: Calling all_inventory to load vars for managed_node2 8119 1726773075.92909: Calling groups_inventory to load vars for managed_node2 8119 1726773075.92914: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773075.92935: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.92945: Calling all_plugins_play to load vars for managed_node2 8119 1726773075.92955: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.92963: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773075.92973: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.92979: Calling groups_plugins_play to load vars for managed_node2 8119 1726773075.92991: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.93016: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.93032: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773075.93239: done with get_vars() 8119 1726773075.93250: done getting variables 8119 1726773075.93254: sending task start callback, copying the task so we can template it temporarily 8119 1726773075.93256: done copying, going to template now 8119 1726773075.93258: done templating 8119 1726773075.93259: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.047) 0:01:10.489 **** 8119 1726773075.93275: sending task start callback 8119 1726773075.93277: entering _queue_task() for managed_node2/package 8119 1726773075.93408: worker is 1 (out of 1 available) 8119 1726773075.93447: exiting _queue_task() for managed_node2/package 8119 1726773075.93524: done queuing things up, now waiting for results queue to drain 8119 1726773075.93529: waiting for pending results... 
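
Editor's note: the include_vars task above used the first_found lookup and ended up loading vars/default.yml, which set __kernel_settings_packages and __kernel_settings_services as facts. A sketch of that select-then-load pattern; only default.yml and the resulting values are confirmed by the log, the distro-specific candidate names are placeholders.

import os
import yaml  # PyYAML

VARS_DIR = "/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars"

# Only "default.yml" appears in ansible_included_var_files above; the other
# names stand in for whatever candidates the first_found lookup was given.
CANDIDATES = ["CentOS-8.yml", "RedHat.yml", "default.yml"]

def first_found(directory, names):
    """Return the first candidate file that exists, like the first_found lookup."""
    for name in names:
        path = os.path.join(directory, name)
        if os.path.exists(path):
            return path
    raise FileNotFoundError("no vars file found")

facts = {}
with open(first_found(VARS_DIR, CANDIDATES)) as handle:
    facts.update(yaml.safe_load(handle))

# Per the task result above, default.yml yields:
#   __kernel_settings_packages: [tuned, python3-configobj]
#   __kernel_settings_services: [tuned]
print(facts)
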
10929 1726773075.93585: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 10929 1726773075.93635: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005e4 10929 1726773075.93681: calling self._execute() 10929 1726773075.95624: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10929 1726773075.95708: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10929 1726773075.95776: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10929 1726773075.95815: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10929 1726773075.95863: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10929 1726773075.95908: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10929 1726773075.95975: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10929 1726773075.96000: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10929 1726773075.96018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10929 1726773075.96101: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10929 1726773075.96119: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10929 1726773075.96133: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10929 1726773075.96278: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10929 1726773075.96285: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10929 1726773075.96290: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10929 1726773075.96293: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10929 1726773075.96297: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10929 1726773075.96300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10929 1726773075.96303: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10929 1726773075.96305: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10929 1726773075.96307: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10929 1726773075.96325: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 10929 1726773075.96328: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10929 1726773075.96330: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10929 1726773075.96501: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10929 1726773075.96541: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10929 1726773075.96552: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10929 1726773075.96562: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10929 1726773075.96567: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10929 1726773075.96663: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10929 1726773075.96672: starting attempt loop 10929 1726773075.96674: running the handler 10929 1726773075.96792: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale 10929 1726773075.96803: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity 10929 1726773075.96810: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap 10929 1726773075.96818: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox 10929 1726773075.96826: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios 10929 1726773075.96844: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/__pycache__ 10929 1726773075.96860: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics/__pycache__ 10929 1726773075.96867: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale/__pycache__ 10929 1726773075.96874: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity/__pycache__ 10929 1726773075.96879: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap/__pycache__ 10929 1726773075.96885: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox/__pycache__ 10929 1726773075.96892: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios/__pycache__ 10929 1726773075.96904: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10 10929 1726773075.96913: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci 10929 1726773075.97020: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos 10929 1726773075.97028: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos 10929 1726773075.97043: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba 10929 1726773075.97048: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa 10929 1726773075.97055: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi 10929 1726773075.97125: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch 10929 1726773075.97134: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point 10929 1726773075.97234: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix 10929 1726773075.97241: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli 10929 1726773075.97247: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine 10929 1726773075.97311: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision 10929 1726773075.97319: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos 10929 1726773075.97351: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus 10929 1726773075.97362: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10 10929 1726773075.97368: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6 10929 1726773075.97374: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9 10929 1726773075.97380: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos 10929 1726773075.97389: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch 10929 1726773075.97395: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos 10929 1726773075.97401: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos 10929 1726773075.97432: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli 10929 1726773075.97441: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos 10929 1726773075.97449: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5 10929 1726773075.97615: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files 10929 1726773075.97623: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer 10929 1726773075.97627: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager 10929 1726773075.97658: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios 10929 1726773075.98120: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr 10929 1726773075.98128: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd 10929 1726773075.98136: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx 10929 1726773075.98156: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos 10929 1726773075.98170: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate 10929 1726773075.98176: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface 10929 1726773075.98182: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios 10929 1726773075.98211: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr 10929 1726773075.98234: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware 10929 1726773075.98242: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential 10929 1726773075.98247: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/junos 10929 1726773075.98279: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2 10929 1726773075.98287: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3 10929 1726773075.98293: 
trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki 10929 1726773075.98314: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact 10929 1726773075.98320: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf 10929 1726773075.98326: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler 10929 1726773075.98345: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor 10929 1726773075.98398: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos 10929 1726773075.98406: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso 10929 1726773075.98415: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage 10929 1726773075.98419: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos 10929 1726773075.98503: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx 10929 1726773075.98533: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx 10929 1726773075.98538: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance 10929 1726773075.98544: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs 10929 1726773075.98550: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos 10929 1726773075.98577: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol 10929 1726773075.98585: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware 10929 1726773075.98591: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf 10929 1726773075.98599: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros 10929 1726773075.98606: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing 10929 1726773075.98611: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive 10929 1726773075.98618: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos 10929 1726773075.98630: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros 10929 1726773075.98636: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system 10929 1726773075.98643: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss 10929 1726773075.98649: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos 10929 1726773075.98670: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/__pycache__ 10929 1726773075.98673: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10/__pycache__ 10929 1726773075.98680: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci/__pycache__ 10929 1726773075.98754: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos/__pycache__ 10929 1726773075.98761: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos/__pycache__ 10929 1726773075.98771: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba/__pycache__ 10929 1726773075.98775: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa/__pycache__ 10929 1726773075.98781: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi/__pycache__ 10929 1726773075.98828: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch/__pycache__ 10929 1726773075.98836: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point/__pycache__ 10929 1726773075.98897: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix/__pycache__ 10929 1726773075.98902: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli/__pycache__ 10929 1726773075.98906: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine/__pycache__ 10929 1726773075.98953: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision/__pycache__ 10929 1726773075.98959: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos/__pycache__ 10929 1726773075.98977: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus/__pycache__ 10929 1726773075.98987: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10/__pycache__ 10929 1726773075.98993: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6/__pycache__ 10929 1726773075.98998: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9/__pycache__ 10929 1726773075.99002: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos/__pycache__ 10929 1726773075.99007: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch/__pycache__ 10929 1726773075.99012: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos/__pycache__ 10929 1726773075.99017: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos/__pycache__ 10929 1726773075.99038: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli/__pycache__ 10929 1726773075.99042: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos/__pycache__ 10929 1726773075.99048: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5/__pycache__ 10929 1726773075.99164: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files/__pycache__ 10929 1726773075.99174: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer/__pycache__ 10929 1726773075.99178: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager/__pycache__ 10929 1726773075.99199: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios/__pycache__ 10929 1726773075.99459: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr/__pycache__ 10929 1726773075.99466: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd/__pycache__ 10929 1726773075.99471: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx/__pycache__ 10929 1726773075.99485: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos/__pycache__ 10929 1726773075.99495: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate/__pycache__ 10929 1726773075.99501: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface/__pycache__ 10929 1726773075.99509: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios/__pycache__ 10929 1726773075.99531: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr/__pycache__ 10929 1726773075.99546: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware/__pycache__ 10929 1726773075.99551: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential/__pycache__ 10929 1726773075.99555: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/junos/__pycache__ 10929 1726773075.99575: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2/__pycache__ 10929 1726773075.99579: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3/__pycache__ 10929 1726773075.99585: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki/__pycache__ 10929 1726773075.99600: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact/__pycache__ 10929 1726773075.99604: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf/__pycache__ 10929 1726773075.99609: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler/__pycache__ 10929 1726773075.99624: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor/__pycache__ 10929 1726773075.99659: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos/__pycache__ 10929 1726773075.99665: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso/__pycache__ 10929 1726773075.99671: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage/__pycache__ 10929 1726773075.99675: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos/__pycache__ 10929 1726773075.99730: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx/__pycache__ 10929 1726773075.99754: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx/__pycache__ 10929 1726773075.99759: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance/__pycache__ 10929 1726773075.99763: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs/__pycache__ 10929 1726773075.99768: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos/__pycache__ 10929 1726773075.99786: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol/__pycache__ 10929 1726773075.99791: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware/__pycache__ 10929 1726773075.99796: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf/__pycache__ 10929 1726773075.99800: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros/__pycache__ 10929 1726773075.99804: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing/__pycache__ 10929 1726773075.99808: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive/__pycache__ 10929 1726773075.99813: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos/__pycache__ 10929 1726773075.99821: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros/__pycache__ 10929 1726773075.99826: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system/__pycache__ 10929 1726773075.99832: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss/__pycache__ 10929 1726773075.99836: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos/__pycache__ 10929 1726773075.99856: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification/__pycache__ 10929 1726773075.99877: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/language 10929 1726773075.99894: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/os 10929 1726773075.99950: _low_level_execute_command(): starting 10929 1726773075.99956: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10929 1726773076.02515: stdout chunk (state=2): >>>/root <<< 10929 1726773076.02693: stderr chunk (state=3): >>><<< 10929 1726773076.02699: stdout chunk (state=3): >>><<< 10929 1726773076.02725: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10929 1726773076.02744: _low_level_execute_command(): starting 10929 1726773076.02751: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445 `" && echo ansible-tmp-1726773076.0273564-10929-154149910510445="` echo /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445 `" ) && sleep 0' 10929 1726773076.06015: stdout chunk (state=2): >>>ansible-tmp-1726773076.0273564-10929-154149910510445=/root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445 <<< 10929 1726773076.06451: stderr chunk (state=3): >>><<< 10929 1726773076.06459: stdout chunk (state=3): >>><<< 10929 1726773076.06486: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773076.0273564-10929-154149910510445=/root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445 , stderr= 10929 1726773076.06654: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/dnf-ZIP_DEFLATED 10929 1726773076.06750: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445/AnsiballZ_dnf.py 10929 1726773076.08148: Sending initial data 10929 1726773076.08161: Sent initial data (151 bytes) 10929 1726773076.10904: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp1q5expc5 /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445/AnsiballZ_dnf.py <<< 10929 1726773076.12550: stderr chunk (state=3): >>><<< 10929 1726773076.12558: stdout chunk (state=3): >>><<< 10929 1726773076.12590: done transferring module to remote 10929 1726773076.12615: _low_level_execute_command(): starting 10929 1726773076.12622: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445/ /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445/AnsiballZ_dnf.py && sleep 0' 10929 1726773076.15939: stderr chunk (state=2): >>><<< 10929 1726773076.15954: stdout chunk (state=2): >>><<< 10929 1726773076.15978: _low_level_execute_command() done: rc=0, stdout=, stderr= 10929 1726773076.15986: _low_level_execute_command(): starting 10929 1726773076.15996: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445/AnsiballZ_dnf.py && sleep 0' 10929 1726773078.65230: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10929 
1726773078.68368: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10929 1726773078.68417: stderr chunk (state=3): >>><<< 10929 1726773078.68423: stdout chunk (state=3): >>><<< 10929 1726773078.68448: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.150 closed. 10929 1726773078.68514: done with _execute_module (dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10929 1726773078.68523: _low_level_execute_command(): starting 10929 1726773078.68528: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773076.0273564-10929-154149910510445/ > /dev/null 2>&1 && sleep 0' 10929 1726773078.71187: stderr chunk (state=2): >>><<< 10929 1726773078.71199: stdout chunk (state=2): >>><<< 10929 1726773078.71218: _low_level_execute_command() done: rc=0, stdout=, stderr= 10929 1726773078.71226: handler run complete 10929 1726773078.71267: attempt loop complete, returning result 10929 1726773078.71287: _execute() done 10929 1726773078.71290: dumping result to json 10929 1726773078.71294: done dumping result, returning 10929 1726773078.71307: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [12a3200b-1e9d-1dbd-cc52-0000000005e4] 10929 1726773078.71320: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e4 10929 1726773078.71354: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e4 10929 1726773078.71358: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8119 1726773078.71625: no more pending results, returning what we have 8119 1726773078.71632: results queue empty 8119 1726773078.71634: checking for any_errors_fatal 8119 1726773078.71639: done checking for any_errors_fatal 8119 1726773078.71641: checking for max_fail_percentage 8119 1726773078.71643: done checking for max_fail_percentage 8119 1726773078.71645: checking to see if all hosts have failed and the running result is not ok 8119 1726773078.71647: done checking to see if all hosts have failed 8119 1726773078.71648: getting the remaining hosts for this loop 8119 1726773078.71650: done getting the remaining hosts for 
this loop 8119 1726773078.71656: building list of next tasks for hosts 8119 1726773078.71658: getting the next task for host managed_node2 8119 1726773078.71665: done getting next task for host managed_node2 8119 1726773078.71668: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773078.71670: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773078.71672: done building task lists 8119 1726773078.71673: counting tasks in each state of execution 8119 1726773078.71676: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773078.71677: advancing hosts in ITERATING_TASKS 8119 1726773078.71679: starting to advance hosts 8119 1726773078.71681: getting the next task for host managed_node2 8119 1726773078.71686: done getting next task for host managed_node2 8119 1726773078.71690: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773078.71694: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773078.71695: done advancing hosts to next task 8119 1726773078.71710: Loading ActionModule 'debug' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773078.71713: getting variables 8119 1726773078.71715: in VariableManager get_vars() 8119 1726773078.71740: Calling all_inventory to load vars for managed_node2 8119 1726773078.71744: Calling groups_inventory to load vars for managed_node2 8119 1726773078.71746: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773078.71768: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.71778: Calling all_plugins_play to load vars for managed_node2 8119 1726773078.71791: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.71802: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773078.71819: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.71827: Calling groups_plugins_play to load vars for managed_node2 8119 1726773078.71836: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.71854: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.71868: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.72071: done with get_vars() 8119 1726773078.72082: done getting variables 8119 1726773078.72090: sending task start callback, copying the task so we can template it temporarily 8119 1726773078.72091: done copying, going to template now 8119 1726773078.72093: done templating 8119 1726773078.72095: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:18 -0400 (0:00:02.788) 0:01:13.277 **** 8119 1726773078.72112: sending task start callback 8119 1726773078.72114: entering _queue_task() for managed_node2/debug 8119 1726773078.72246: worker is 1 (out of 1 available) 8119 1726773078.72285: exiting _queue_task() for managed_node2/debug 8119 1726773078.72361: done queuing things up, now waiting for results queue to drain 8119 1726773078.72366: waiting for pending results... 
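
Editor's note: the _low_level_execute_command() calls above show the remote module lifecycle for the dnf run - create a remote ansible-tmp directory, transfer AnsiballZ_dnf.py over SFTP, chmod u+x, execute it with /usr/libexec/platform-python, then remove the directory. A simplified, local-only stand-in for that ordering (the real run performs each step over SSH/SFTP against managed_node2; nothing here is Ansible's own code):

import shutil
import subprocess
import tempfile
import time
from pathlib import Path

def run_payload(payload: Path, python="/usr/libexec/platform-python"):
    # mkdir -p ~/.ansible/tmp/ansible-tmp-<ts>-<pid>-<rand> in the log
    tmpdir = Path(tempfile.mkdtemp(prefix=f"ansible-tmp-{time.time()}-"))
    try:
        staged = tmpdir / payload.name
        shutil.copy(payload, staged)          # stands in for the sftp "put"
        staged.chmod(0o700)                   # stands in for chmod u+x
        return subprocess.run([python, str(staged)],
                              capture_output=True, text=True)
    finally:
        shutil.rmtree(tmpdir)                 # rm -f -r .../ansible-tmp-.../

# e.g. run_payload(Path("AnsiballZ_dnf.py")) - payload name taken from the log;
# the module there reported {"msg": "Nothing to do", "changed": false, "rc": 0}.
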
10982 1726773078.72424: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10982 1726773078.72474: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005e6 10982 1726773078.72523: calling self._execute() 10982 1726773078.74274: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10982 1726773078.74369: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10982 1726773078.74420: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10982 1726773078.74449: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10982 1726773078.74485: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10982 1726773078.74514: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10982 1726773078.74558: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10982 1726773078.74587: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10982 1726773078.74612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10982 1726773078.74699: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10982 1726773078.74720: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10982 1726773078.74739: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10982 1726773078.74992: when evaluation is False, skipping this task 10982 1726773078.74996: _execute() done 10982 1726773078.74998: dumping result to json 10982 1726773078.75000: done dumping result, returning 10982 1726773078.75004: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000005e6] 10982 1726773078.75012: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e6 10982 1726773078.75037: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e6 10982 1726773078.75040: WORKER PROCESS EXITING skipping: [managed_node2] => {} 8119 1726773078.75227: no more pending results, returning what we have 8119 1726773078.75232: results queue empty 8119 1726773078.75234: checking for any_errors_fatal 8119 1726773078.75240: done checking for any_errors_fatal 8119 1726773078.75242: checking for max_fail_percentage 8119 1726773078.75245: done checking for max_fail_percentage 8119 1726773078.75247: checking to see if all hosts have failed and the running result is not ok 8119 1726773078.75249: done checking to see if all hosts have failed 8119 1726773078.75251: getting the remaining hosts for this loop 8119 1726773078.75253: done getting the remaining hosts for this loop 8119 1726773078.75261: building list of next tasks for hosts 8119 1726773078.75263: getting the next task for host managed_node2 8119 1726773078.75270: done getting next task for host managed_node2 8119 1726773078.75275: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773078.75280: ^ state 
is: HOST STATE: block=2, task=32, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773078.75282: done building task lists 8119 1726773078.75286: counting tasks in each state of execution 8119 1726773078.75290: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773078.75292: advancing hosts in ITERATING_TASKS 8119 1726773078.75295: starting to advance hosts 8119 1726773078.75297: getting the next task for host managed_node2 8119 1726773078.75301: done getting next task for host managed_node2 8119 1726773078.75304: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773078.75307: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773078.75312: done advancing hosts to next task 8119 1726773078.75324: Loading ActionModule 'reboot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773078.75327: getting variables 8119 1726773078.75330: in VariableManager get_vars() 8119 1726773078.75356: Calling all_inventory to load vars for managed_node2 8119 1726773078.75359: Calling groups_inventory to load vars for managed_node2 8119 1726773078.75362: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773078.75382: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.75395: Calling all_plugins_play to load vars for managed_node2 8119 1726773078.75406: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.75418: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773078.75433: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.75440: Calling groups_plugins_play to load vars for managed_node2 8119 1726773078.75450: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.75467: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.75481: Loading VarsModule 
'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.75717: done with get_vars() 8119 1726773078.75727: done getting variables 8119 1726773078.75731: sending task start callback, copying the task so we can template it temporarily 8119 1726773078.75733: done copying, going to template now 8119 1726773078.75735: done templating 8119 1726773078.75736: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:18 -0400 (0:00:00.036) 0:01:13.313 **** 8119 1726773078.75751: sending task start callback 8119 1726773078.75753: entering _queue_task() for managed_node2/reboot 8119 1726773078.75876: worker is 1 (out of 1 available) 8119 1726773078.75917: exiting _queue_task() for managed_node2/reboot 8119 1726773078.75989: done queuing things up, now waiting for results queue to drain 8119 1726773078.75994: waiting for pending results... 10984 1726773078.76053: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10984 1726773078.76110: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005e7 10984 1726773078.76155: calling self._execute() 10984 1726773078.77906: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10984 1726773078.77993: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10984 1726773078.78046: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10984 1726773078.78074: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10984 1726773078.78105: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10984 1726773078.78138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10984 1726773078.78188: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10984 1726773078.78225: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10984 1726773078.78243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10984 1726773078.78326: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10984 1726773078.78343: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10984 1726773078.78357: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10984 1726773078.78612: when evaluation is False, skipping this task 10984 1726773078.78617: _execute() done 10984 1726773078.78619: dumping result to json 10984 1726773078.78621: done dumping result, returning 10984 1726773078.78625: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [12a3200b-1e9d-1dbd-cc52-0000000005e7] 10984 1726773078.78632: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e7 10984 1726773078.78657: done sending task 
result for task 12a3200b-1e9d-1dbd-cc52-0000000005e7 10984 1726773078.78661: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773078.78842: no more pending results, returning what we have 8119 1726773078.78846: results queue empty 8119 1726773078.78848: checking for any_errors_fatal 8119 1726773078.78853: done checking for any_errors_fatal 8119 1726773078.78855: checking for max_fail_percentage 8119 1726773078.78857: done checking for max_fail_percentage 8119 1726773078.78859: checking to see if all hosts have failed and the running result is not ok 8119 1726773078.78861: done checking to see if all hosts have failed 8119 1726773078.78863: getting the remaining hosts for this loop 8119 1726773078.78866: done getting the remaining hosts for this loop 8119 1726773078.78872: building list of next tasks for hosts 8119 1726773078.78875: getting the next task for host managed_node2 8119 1726773078.78882: done getting next task for host managed_node2 8119 1726773078.78890: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773078.78894: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773078.78896: done building task lists 8119 1726773078.78899: counting tasks in each state of execution 8119 1726773078.78903: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773078.78905: advancing hosts in ITERATING_TASKS 8119 1726773078.78907: starting to advance hosts 8119 1726773078.78912: getting the next task for host managed_node2 8119 1726773078.78916: done getting next task for host managed_node2 8119 1726773078.78919: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773078.78922: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773078.78924: done advancing hosts to next task 8119 1726773078.78938: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773078.78942: getting variables 8119 1726773078.78945: in VariableManager get_vars() 8119 1726773078.78973: Calling all_inventory to load vars for managed_node2 8119 1726773078.78977: Calling groups_inventory to load vars for managed_node2 8119 1726773078.78979: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773078.79001: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.79014: Calling all_plugins_play to load vars for managed_node2 8119 1726773078.79025: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.79034: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773078.79044: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.79050: Calling groups_plugins_play to load vars for managed_node2 8119 1726773078.79060: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.79084: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.79099: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.79306: done with get_vars() 8119 1726773078.79318: done getting variables 8119 1726773078.79323: sending task start callback, copying the task so we can template it temporarily 8119 1726773078.79325: done copying, going to template now 8119 1726773078.79327: done templating 8119 1726773078.79328: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:18 -0400 (0:00:00.035) 0:01:13.349 **** 8119 1726773078.79343: sending task start callback 8119 1726773078.79345: entering _queue_task() for managed_node2/fail 8119 1726773078.79464: worker is 1 (out of 1 available) 8119 1726773078.79504: exiting _queue_task() for managed_node2/fail 8119 1726773078.79579: done queuing things up, now waiting for results queue to drain 8119 1726773078.79586: waiting for pending results... 
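(The "Reboot transactional update systems" result above is skipped with skip_reason "Conditional result was False", i.e. its when: clause evaluated to False inside _execute(), and the "Fail if reboot is needed and not set" task queued here is skipped the same way just below. A minimal sketch of what such guarded tasks can look like in a role's tasks file follows; it is illustrative only, and the variable names __kernel_settings_reboot_needed and kernel_settings_reboot_ok are assumptions, not necessarily the role's actual variables.)

    # Illustrative sketch only -- not the role's actual tasks/main.yml.
    # Variable names are hypothetical assumptions.
    - name: Reboot transactional update systems
      reboot:
      when: __kernel_settings_reboot_needed | d(false)

    - name: Fail if reboot is needed and not set
      fail:
        msg: Reboot is required to apply changes but has not been allowed
      when:
        - __kernel_settings_reboot_needed | d(false)
        - not kernel_settings_reboot_ok | d(false)

(When every entry of the when: list is not True, the TaskExecutor short-circuits before building a connection, which is why no _low_level_execute_command() calls appear for these two tasks.)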
10986 1726773078.79641: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10986 1726773078.79697: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005e8 10986 1726773078.79743: calling self._execute() 10986 1726773078.81498: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10986 1726773078.81588: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10986 1726773078.81638: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10986 1726773078.81667: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10986 1726773078.81710: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10986 1726773078.81739: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10986 1726773078.81801: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10986 1726773078.81828: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10986 1726773078.81846: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10986 1726773078.81926: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10986 1726773078.81946: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10986 1726773078.81960: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10986 1726773078.82217: when evaluation is False, skipping this task 10986 1726773078.82222: _execute() done 10986 1726773078.82224: dumping result to json 10986 1726773078.82226: done dumping result, returning 10986 1726773078.82232: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [12a3200b-1e9d-1dbd-cc52-0000000005e8] 10986 1726773078.82241: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e8 10986 1726773078.82267: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005e8 10986 1726773078.82270: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773078.82469: no more pending results, returning what we have 8119 1726773078.82473: results queue empty 8119 1726773078.82475: checking for any_errors_fatal 8119 1726773078.82479: done checking for any_errors_fatal 8119 1726773078.82481: checking for max_fail_percentage 8119 1726773078.82487: done checking for max_fail_percentage 8119 1726773078.82489: checking to see if all hosts have failed and the running result is not ok 8119 1726773078.82492: done checking to see if all hosts have failed 8119 1726773078.82494: getting the remaining hosts for this loop 8119 1726773078.82497: done getting the remaining hosts for this loop 8119 1726773078.82505: building list of next tasks for hosts 8119 1726773078.82507: getting the next task for host managed_node2 8119 1726773078.82519: done getting next task for host managed_node2 8119 1726773078.82524: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 
1726773078.82528: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773078.82531: done building task lists 8119 1726773078.82533: counting tasks in each state of execution 8119 1726773078.82537: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773078.82539: advancing hosts in ITERATING_TASKS 8119 1726773078.82540: starting to advance hosts 8119 1726773078.82542: getting the next task for host managed_node2 8119 1726773078.82545: done getting next task for host managed_node2 8119 1726773078.82547: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 1726773078.82549: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773078.82551: done advancing hosts to next task 8119 1726773078.82586: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773078.82591: getting variables 8119 1726773078.82593: in VariableManager get_vars() 8119 1726773078.82622: Calling all_inventory to load vars for managed_node2 8119 1726773078.82625: Calling groups_inventory to load vars for managed_node2 8119 1726773078.82628: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773078.82648: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.82658: Calling all_plugins_play to load vars for managed_node2 8119 1726773078.82668: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.82677: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773078.82689: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.82697: Calling groups_plugins_play to load vars for managed_node2 8119 1726773078.82706: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.82728: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773078.82743: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, 
class_only=False) 8119 1726773078.82953: done with get_vars() 8119 1726773078.82964: done getting variables 8119 1726773078.82968: sending task start callback, copying the task so we can template it temporarily 8119 1726773078.82969: done copying, going to template now 8119 1726773078.82971: done templating 8119 1726773078.82973: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:18 -0400 (0:00:00.036) 0:01:13.386 **** 8119 1726773078.82991: sending task start callback 8119 1726773078.82993: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773078.83124: worker is 1 (out of 1 available) 8119 1726773078.83160: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773078.83237: done queuing things up, now waiting for results queue to drain 8119 1726773078.83242: waiting for pending results... 10988 1726773078.83300: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10988 1726773078.83350: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005ea 10988 1726773078.83396: calling self._execute() 10988 1726773078.85168: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10988 1726773078.85251: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10988 1726773078.85313: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10988 1726773078.85340: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10988 1726773078.85373: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10988 1726773078.85410: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10988 1726773078.85454: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10988 1726773078.85478: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10988 1726773078.85499: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10988 1726773078.85577: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10988 1726773078.85601: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10988 1726773078.85619: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10988 1726773078.85844: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10988 1726773078.85879: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10988 1726773078.85892: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 10988 1726773078.85903: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10988 1726773078.85908: Loading ShellModule 'sh' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10988 1726773078.85992: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10988 1726773078.86005: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10988 1726773078.86029: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 10988 1726773078.86045: starting attempt loop 10988 1726773078.86049: running the handler 10988 1726773078.86056: _low_level_execute_command(): starting 10988 1726773078.86060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10988 1726773078.88737: stdout chunk (state=2): >>>/root <<< 10988 1726773078.88864: stderr chunk (state=3): >>><<< 10988 1726773078.88870: stdout chunk (state=3): >>><<< 10988 1726773078.88892: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10988 1726773078.88912: _low_level_execute_command(): starting 10988 1726773078.88920: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004 `" && echo ansible-tmp-1726773078.8890412-10988-195421346699004="` echo /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004 `" ) && sleep 0' 10988 1726773078.91889: stdout chunk (state=2): >>>ansible-tmp-1726773078.8890412-10988-195421346699004=/root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004 <<< 10988 1726773078.92019: stderr chunk (state=3): >>><<< 10988 1726773078.92025: stdout chunk (state=3): >>><<< 10988 1726773078.92049: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773078.8890412-10988-195421346699004=/root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004 , stderr= 10988 1726773078.92127: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 10988 1726773078.92191: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004/AnsiballZ_kernel_settings_get_config.py 10988 1726773078.92512: Sending initial data 10988 1726773078.92527: Sent initial data (174 bytes) 10988 1726773078.95078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmplwkj5vk2 /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004/AnsiballZ_kernel_settings_get_config.py <<< 10988 1726773078.96073: stderr chunk (state=3): >>><<< 10988 1726773078.96078: stdout chunk (state=3): >>><<< 10988 1726773078.96102: done transferring module to remote 10988 1726773078.96119: _low_level_execute_command(): starting 10988 1726773078.96124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004/ /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10988 1726773078.98672: stderr chunk (state=2): >>><<< 10988 1726773078.98684: stdout chunk (state=2): >>><<< 10988 1726773078.98702: 
_low_level_execute_command() done: rc=0, stdout=, stderr= 10988 1726773078.98706: _low_level_execute_command(): starting 10988 1726773078.98714: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10988 1726773079.13874: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10988 1726773079.15034: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10988 1726773079.15076: stderr chunk (state=3): >>><<< 10988 1726773079.15080: stdout chunk (state=3): >>><<< 10988 1726773079.15102: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 10988 1726773079.15135: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10988 1726773079.15151: _low_level_execute_command(): starting 10988 1726773079.15156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773078.8890412-10988-195421346699004/ > /dev/null 2>&1 && sleep 0' 10988 1726773079.17902: stderr chunk (state=2): >>><<< 10988 1726773079.17916: stdout chunk (state=2): >>><<< 10988 1726773079.17938: _low_level_execute_command() done: rc=0, stdout=, stderr= 10988 1726773079.17947: handler run complete 10988 1726773079.17976: attempt loop complete, returning result 10988 1726773079.17991: _execute() done 10988 1726773079.17994: dumping result to json 10988 1726773079.17997: done dumping result, returning 10988 1726773079.18011: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [12a3200b-1e9d-1dbd-cc52-0000000005ea] 10988 1726773079.18024: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ea 10988 1726773079.18067: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ea 10988 1726773079.18108: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", 
"sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8119 1726773079.18312: no more pending results, returning what we have 8119 1726773079.18319: results queue empty 8119 1726773079.18322: checking for any_errors_fatal 8119 1726773079.18326: done checking for any_errors_fatal 8119 1726773079.18328: checking for max_fail_percentage 8119 1726773079.18331: done checking for max_fail_percentage 8119 1726773079.18333: checking to see if all hosts have failed and the running result is not ok 8119 1726773079.18335: done checking to see if all hosts have failed 8119 1726773079.18337: getting the remaining hosts for this loop 8119 1726773079.18340: done getting the remaining hosts for this loop 8119 1726773079.18347: building list of next tasks for hosts 8119 1726773079.18350: getting the next task for host managed_node2 8119 1726773079.18356: done getting next task for host managed_node2 8119 1726773079.18360: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773079.18364: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773079.18366: done building task lists 8119 1726773079.18368: counting tasks in each state of execution 8119 1726773079.18372: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773079.18374: advancing hosts in ITERATING_TASKS 8119 1726773079.18375: starting to advance hosts 8119 1726773079.18376: getting the next task for host managed_node2 8119 1726773079.18379: done getting next task for host managed_node2 8119 1726773079.18381: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773079.18384: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773079.18388: done advancing hosts to next task 8119 1726773079.18401: getting variables 8119 1726773079.18404: in VariableManager get_vars() 8119 1726773079.18432: Calling all_inventory to load vars for managed_node2 8119 1726773079.18436: Calling groups_inventory to load vars for managed_node2 8119 1726773079.18438: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773079.18460: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.18470: Calling all_plugins_play to load vars for managed_node2 8119 1726773079.18480: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.18493: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773079.18512: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.18520: Calling groups_plugins_play to load vars for managed_node2 8119 1726773079.18530: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.18548: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.18561: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.18769: done with get_vars() 8119 1726773079.18778: done getting variables 8119 1726773079.18785: sending task start callback, copying the task so we can template it temporarily 8119 1726773079.18787: done copying, going to template now 8119 1726773079.18789: done templating 8119 1726773079.18791: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:19 -0400 (0:00:00.358) 0:01:13.744 **** 8119 1726773079.18806: sending task start callback 8119 1726773079.18810: entering _queue_task() for managed_node2/stat 8119 1726773079.18942: worker is 1 (out of 1 available) 8119 1726773079.18981: exiting _queue_task() for managed_node2/stat 8119 1726773079.19056: done queuing things up, now waiting for results queue to drain 8119 1726773079.19062: waiting for pending results... 
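(The run above shows the full AnsiballZ lifecycle for the collection's kernel_settings_get_config module: create a remote temp directory under ~/.ansible/tmp, sftp the packed AnsiballZ_kernel_settings_get_config.py, chmod u+x, execute it with /usr/libexec/platform-python, then remove the temp directory. The module returned the parsed contents of /etc/tuned/tuned-main.conf. A rough equivalent of the task driving that run is sketched below; the task name, module FQCN, and path argument are taken from the log, while the register name is a hypothetical assumption.)

    - name: Read tuned main config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/tuned-main.conf
      register: __kernel_settings_tuned_main   # hypothetical register name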
10997 1726773079.19124: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10997 1726773079.19174: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005eb 10997 1726773079.20937: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 10997 1726773079.21020: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 10997 1726773079.21074: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 10997 1726773079.21104: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 10997 1726773079.21133: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 10997 1726773079.21171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 10997 1726773079.21235: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 10997 1726773079.21257: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 10997 1726773079.21273: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 10997 1726773079.21358: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 10997 1726773079.21374: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 10997 1726773079.21391: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 10997 1726773079.21738: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10997 1726773079.21742: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10997 1726773079.21744: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10997 1726773079.21746: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10997 1726773079.21748: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10997 1726773079.21750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.21752: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10997 1726773079.21754: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10997 1726773079.21757: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10997 1726773079.21776: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10997 
1726773079.21780: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10997 1726773079.21782: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.22058: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10997 1726773079.22062: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10997 1726773079.22065: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10997 1726773079.22067: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10997 1726773079.22068: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10997 1726773079.22070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.22073: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10997 1726773079.22074: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10997 1726773079.22076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10997 1726773079.22097: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10997 1726773079.22102: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10997 1726773079.22105: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.22272: when evaluation is False, skipping this task 10997 1726773079.22308: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10997 1726773079.22313: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10997 1726773079.22316: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10997 1726773079.22318: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10997 1726773079.22320: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10997 1726773079.22321: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.22323: Loading 
FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10997 1726773079.22326: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10997 1726773079.22329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10997 1726773079.22351: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10997 1726773079.22354: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10997 1726773079.22356: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.22538: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10997 1726773079.22544: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10997 1726773079.22548: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10997 1726773079.22551: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10997 1726773079.22554: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10997 1726773079.22557: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.22560: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10997 1726773079.22563: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10997 1726773079.22566: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10997 1726773079.22597: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10997 1726773079.22602: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10997 1726773079.22605: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "item": "", "skip_reason": "Conditional result was False" } 10997 1726773079.22880: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 10997 1726773079.22920: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10997 1726773079.22930: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 
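(The "Find tuned profile parent directory" task runs the stat module in a loop: the first loop item is an empty string and is skipped by its conditional ("skipping: [managed_node2] => (item=)"), then /etc/tuned/profiles and /etc/tuned are stat'ed over SSH below. A sketch of a stat-in-a-loop task consistent with that output follows; only the task name and the two /etc/tuned* paths come from the log, while the first loop item, the when: guard, and the register name are assumptions.)

    # Illustrative sketch; names marked as assumptions in the lead-in.
    - name: Find tuned profile parent directory
      stat:
        path: "{{ item }}"
      register: __kernel_settings_profile_dirs
      when: item | length > 0          # empty items produce the (item=) skip seen above
      loop:
        - "{{ __custom_profile_dir | d('') }}"   # hypothetical first item; '' in this run
        - /etc/tuned/profiles
        - /etc/tuned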
10997 1726773079.22940: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10997 1726773079.22945: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10997 1726773079.23030: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10997 1726773079.23043: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10997 1726773079.23067: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 10997 1726773079.23082: starting attempt loop 10997 1726773079.23087: running the handler 10997 1726773079.23094: _low_level_execute_command(): starting 10997 1726773079.23098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10997 1726773079.25680: stdout chunk (state=2): >>>/root <<< 10997 1726773079.25898: stderr chunk (state=3): >>><<< 10997 1726773079.25904: stdout chunk (state=3): >>><<< 10997 1726773079.25930: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10997 1726773079.25946: _low_level_execute_command(): starting 10997 1726773079.25952: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213 `" && echo ansible-tmp-1726773079.2594004-10997-214701626800213="` echo /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213 `" ) && sleep 0' 10997 1726773079.28953: stdout chunk (state=2): >>>ansible-tmp-1726773079.2594004-10997-214701626800213=/root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213 <<< 10997 1726773079.29087: stderr chunk (state=3): >>><<< 10997 1726773079.29093: stdout chunk (state=3): >>><<< 10997 1726773079.29113: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773079.2594004-10997-214701626800213=/root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213 , stderr= 10997 1726773079.29202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 10997 1726773079.29264: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213/AnsiballZ_stat.py 10997 1726773079.29611: Sending initial data 10997 1726773079.29626: Sent initial data (152 bytes) 10997 1726773079.32262: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp5qn8waup /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213/AnsiballZ_stat.py <<< 10997 1726773079.33410: stderr chunk (state=3): >>><<< 10997 1726773079.33419: stdout chunk (state=3): >>><<< 10997 1726773079.33449: done transferring module to remote 10997 1726773079.33467: _low_level_execute_command(): starting 10997 1726773079.33473: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213/ /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213/AnsiballZ_stat.py && sleep 0' 10997 1726773079.36499: stderr chunk (state=2): >>><<< 10997 1726773079.36517: stdout chunk (state=2): >>><<< 10997 
1726773079.36544: _low_level_execute_command() done: rc=0, stdout=, stderr= 10997 1726773079.36551: _low_level_execute_command(): starting 10997 1726773079.36560: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213/AnsiballZ_stat.py && sleep 0' 10997 1726773079.52323: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10997 1726773079.53410: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 10997 1726773079.53456: stderr chunk (state=3): >>><<< 10997 1726773079.53463: stdout chunk (state=3): >>><<< 10997 1726773079.53485: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 10997 1726773079.53513: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10997 1726773079.53526: _low_level_execute_command(): starting 10997 1726773079.53532: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773079.2594004-10997-214701626800213/ > /dev/null 2>&1 && sleep 0' 10997 1726773079.56257: stderr chunk (state=2): >>><<< 10997 1726773079.56269: stdout chunk (state=2): >>><<< 10997 1726773079.56290: _low_level_execute_command() done: rc=0, stdout=, stderr= 10997 1726773079.56297: handler run complete 10997 1726773079.56323: attempt loop complete, returning result 10997 1726773079.56653: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10997 1726773079.56660: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 10997 1726773079.56663: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 10997 1726773079.56666: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 10997 1726773079.56668: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 10997 1726773079.56670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10997 1726773079.56672: Loading FilterModule 'network' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 10997 1726773079.56674: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10997 1726773079.56676: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10997 1726773079.56705: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10997 1726773079.56710: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10997 1726773079.56714: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10997 1726773079.57002: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10997 1726773079.57012: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10997 1726773079.57018: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10997 1726773079.57089: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10997 1726773079.57104: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 10997 1726773079.57110: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10997 1726773079.57119: starting attempt loop 10997 1726773079.57122: running the handler 10997 1726773079.57129: _low_level_execute_command(): starting 10997 1726773079.57133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10997 1726773079.59764: stdout chunk (state=2): >>>/root <<< 10997 1726773079.59888: stderr chunk (state=3): >>><<< 10997 1726773079.59894: stdout chunk (state=3): >>><<< 10997 1726773079.59918: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10997 1726773079.59934: _low_level_execute_command(): starting 10997 1726773079.59941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773 `" && echo ansible-tmp-1726773079.5992866-10997-133402408609773="` echo /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773 `" ) && sleep 0' 10997 1726773079.62877: stdout chunk (state=2): >>>ansible-tmp-1726773079.5992866-10997-133402408609773=/root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773 <<< 10997 1726773079.63006: stderr chunk (state=3): >>><<< 10997 1726773079.63017: stdout chunk (state=3): >>><<< 10997 1726773079.63034: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773079.5992866-10997-133402408609773=/root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773 , stderr= 10997 1726773079.63121: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 10997 1726773079.63171: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773/AnsiballZ_stat.py 10997 1726773079.63479: Sending initial data 10997 1726773079.63500: Sent initial data (152 bytes) 10997 1726773079.66326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpwmq00y50 /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773/AnsiballZ_stat.py <<< 10997 1726773079.67338: stderr chunk (state=3): >>><<< 10997 1726773079.67347: stdout chunk (state=3): >>><<< 10997 1726773079.67374: done transferring module to remote 10997 1726773079.67389: _low_level_execute_command(): starting 10997 1726773079.67395: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773/ /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773/AnsiballZ_stat.py && sleep 0' 10997 1726773079.69998: stderr chunk (state=2): >>><<< 10997 1726773079.70014: stdout chunk (state=2): >>><<< 10997 1726773079.70037: _low_level_execute_command() done: rc=0, stdout=, stderr= 10997 1726773079.70041: _low_level_execute_command(): starting 10997 1726773079.70048: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773/AnsiballZ_stat.py && sleep 0' 10997 1726773079.85861: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10997 1726773079.87035: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 10997 1726773079.87074: stderr chunk (state=3): >>><<< 10997 1726773079.87079: stdout chunk (state=3): >>><<< 10997 1726773079.87103: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 10997 1726773079.87163: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10997 1726773079.87174: _low_level_execute_command(): starting 10997 1726773079.87179: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773079.5992866-10997-133402408609773/ > /dev/null 2>&1 && sleep 0' 10997 1726773079.89928: stderr chunk (state=2): >>><<< 10997 1726773079.89940: stdout chunk (state=2): >>><<< 10997 1726773079.89963: _low_level_execute_command() done: rc=0, stdout=, stderr= 10997 1726773079.89969: handler run complete 10997 1726773079.90017: attempt loop complete, returning result 10997 1726773079.90187: dumping result to json 10997 1726773079.90240: done dumping result, returning 10997 1726773079.90256: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [12a3200b-1e9d-1dbd-cc52-0000000005eb] 10997 1726773079.90263: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005eb 10997 1726773079.90267: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005eb 10997 1726773079.90269: WORKER PROCESS EXITING ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773035.2883239, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773033.0853279, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 
1726773033.0853279, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8119 1726773079.90488: no more pending results, returning what we have 8119 1726773079.90497: results queue empty 8119 1726773079.90499: checking for any_errors_fatal 8119 1726773079.90504: done checking for any_errors_fatal 8119 1726773079.90506: checking for max_fail_percentage 8119 1726773079.90511: done checking for max_fail_percentage 8119 1726773079.90513: checking to see if all hosts have failed and the running result is not ok 8119 1726773079.90515: done checking to see if all hosts have failed 8119 1726773079.90517: getting the remaining hosts for this loop 8119 1726773079.90520: done getting the remaining hosts for this loop 8119 1726773079.90527: building list of next tasks for hosts 8119 1726773079.90530: getting the next task for host managed_node2 8119 1726773079.90537: done getting next task for host managed_node2 8119 1726773079.90540: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773079.90544: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773079.90547: done building task lists 8119 1726773079.90549: counting tasks in each state of execution 8119 1726773079.90553: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773079.90555: advancing hosts in ITERATING_TASKS 8119 1726773079.90557: starting to advance hosts 8119 1726773079.90560: getting the next task for host managed_node2 8119 1726773079.90563: done getting next task for host managed_node2 8119 1726773079.90566: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773079.90569: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773079.90571: done advancing hosts to next task 8119 1726773079.90587: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773079.90592: getting variables 8119 1726773079.90595: in VariableManager get_vars() 8119 1726773079.90627: Calling all_inventory to load vars for managed_node2 8119 1726773079.90631: Calling groups_inventory to load vars for managed_node2 8119 1726773079.90633: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773079.90668: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.90682: Calling all_plugins_play to load vars for managed_node2 8119 1726773079.90700: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.90713: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773079.90729: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.90747: Calling groups_plugins_play to load vars for managed_node2 8119 1726773079.90760: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.90779: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.90795: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.91120: done with get_vars() 8119 1726773079.91136: done getting variables 8119 1726773079.91143: sending task start callback, copying the task so we can template it temporarily 8119 1726773079.91146: done copying, going to template now 8119 1726773079.91149: done templating 8119 1726773079.91152: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:19 -0400 (0:00:00.723) 0:01:14.468 **** 8119 1726773079.91176: sending task start callback 8119 1726773079.91179: entering _queue_task() for managed_node2/set_fact 8119 1726773079.91340: worker is 1 (out of 1 available) 8119 1726773079.91375: exiting _queue_task() for managed_node2/set_fact 8119 1726773079.91454: done queuing things up, now waiting for results queue to drain 8119 1726773079.91459: waiting for pending results... 
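The stat loop above is what the "Find tuned profile parent directory" task boils down to: probe each candidate directory with the stat module and record whether it exists. A minimal sketch of an equivalent task follows; the register name and any candidate paths beyond /etc/tuned are assumptions for illustration, not taken from this log:

- name: Find tuned profile parent directory
  stat:
    path: "{{ item }}"
  register: __kernel_settings_find_profile_dirs   # assumed name, not shown in the log
  loop:
    - /etc/tuned

The extra module_args dumped in the result above (follow, get_checksum, get_mime, get_attributes, checksum_algorithm) match the stat module's defaults, so a task like this does not need to spell them out.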
11037 1726773079.91680: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 11037 1726773079.91751: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005ec 11037 1726773079.91806: calling self._execute() 11037 1726773079.93800: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11037 1726773079.93886: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11037 1726773079.93941: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11037 1726773079.93968: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11037 1726773079.94004: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11037 1726773079.94033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11037 1726773079.94078: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11037 1726773079.94118: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11037 1726773079.94136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11037 1726773079.94218: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11037 1726773079.94238: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11037 1726773079.94252: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11037 1726773079.94678: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11037 1726773079.94723: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11037 1726773079.94735: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11037 1726773079.94745: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11037 1726773079.94758: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11037 1726773079.94880: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11037 1726773079.94905: starting attempt loop 11037 1726773079.94909: running the handler 11037 1726773079.94925: handler run complete 11037 1726773079.94929: attempt loop complete, returning result 11037 1726773079.94932: _execute() done 11037 1726773079.94934: dumping result to json 11037 1726773079.94942: done dumping result, returning 11037 1726773079.94949: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [12a3200b-1e9d-1dbd-cc52-0000000005ec] 11037 1726773079.94958: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ec 11037 1726773079.94987: done sending task result for task 
12a3200b-1e9d-1dbd-cc52-0000000005ec 11037 1726773079.94991: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8119 1726773079.95684: no more pending results, returning what we have 8119 1726773079.95691: results queue empty 8119 1726773079.95693: checking for any_errors_fatal 8119 1726773079.95702: done checking for any_errors_fatal 8119 1726773079.95704: checking for max_fail_percentage 8119 1726773079.95706: done checking for max_fail_percentage 8119 1726773079.95707: checking to see if all hosts have failed and the running result is not ok 8119 1726773079.95710: done checking to see if all hosts have failed 8119 1726773079.95711: getting the remaining hosts for this loop 8119 1726773079.95713: done getting the remaining hosts for this loop 8119 1726773079.95719: building list of next tasks for hosts 8119 1726773079.95721: getting the next task for host managed_node2 8119 1726773079.95727: done getting next task for host managed_node2 8119 1726773079.95730: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773079.95732: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773079.95734: done building task lists 8119 1726773079.95735: counting tasks in each state of execution 8119 1726773079.95738: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773079.95739: advancing hosts in ITERATING_TASKS 8119 1726773079.95741: starting to advance hosts 8119 1726773079.95742: getting the next task for host managed_node2 8119 1726773079.95745: done getting next task for host managed_node2 8119 1726773079.95748: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773079.95751: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773079.95754: done advancing hosts to next task 8119 1726773079.95768: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773079.95776: getting variables 8119 1726773079.95778: in VariableManager get_vars() 8119 1726773079.95807: Calling all_inventory to load vars for managed_node2 8119 1726773079.95814: Calling groups_inventory to load vars for managed_node2 8119 1726773079.95816: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773079.95837: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.95848: Calling all_plugins_play to load vars for managed_node2 8119 1726773079.95859: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.95873: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773079.95889: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.95900: Calling groups_plugins_play to load vars for managed_node2 8119 1726773079.95917: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.95946: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.95967: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773079.96189: done with get_vars() 8119 1726773079.96200: done getting variables 8119 1726773079.96204: sending task start callback, copying the task so we can template it temporarily 8119 1726773079.96206: done copying, going to template now 8119 1726773079.96210: done templating 8119 1726773079.96212: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:19 -0400 (0:00:00.050) 0:01:14.518 **** 8119 1726773079.96227: sending task start callback 8119 1726773079.96229: entering _queue_task() for managed_node2/service 8119 1726773079.96372: worker is 1 (out of 1 available) 8119 1726773079.96413: exiting _queue_task() for managed_node2/service 8119 1726773079.96485: done queuing things up, now waiting for results queue to drain 8119 1726773079.96491: waiting for pending results... 
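The result just above shows the "Set tuned profile parent dir" task setting the fact __kernel_settings_profile_parent to /etc/tuned, i.e. the directory the preceding stat loop found to exist. A minimal sketch of how such a fact could be derived from a registered stat loop; the registered variable name and the filter chain are assumptions for illustration, not the role's actual expression:

- name: Set tuned profile parent dir
  set_fact:
    # pick the first probed path whose stat result says it exists
    __kernel_settings_profile_parent: "{{ __kernel_settings_find_profile_dirs.results
      | selectattr('stat.exists') | map(attribute='item') | first }}"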
11041 1726773079.96713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 11041 1726773079.96777: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005ed 11041 1726773079.98681: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11041 1726773079.98767: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11041 1726773079.98830: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11041 1726773079.98860: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11041 1726773079.98888: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11041 1726773079.98921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11041 1726773079.98976: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11041 1726773079.99002: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11041 1726773079.99022: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11041 1726773079.99107: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11041 1726773079.99128: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11041 1726773079.99145: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11041 1726773079.99341: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11041 1726773079.99347: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11041 1726773079.99351: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11041 1726773079.99354: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11041 1726773079.99357: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11041 1726773079.99360: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11041 1726773079.99362: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11041 1726773079.99365: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11041 1726773079.99368: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11041 1726773079.99393: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 
11041 1726773079.99398: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11041 1726773079.99402: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11041 1726773079.99617: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11041 1726773079.99623: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11041 1726773079.99627: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11041 1726773079.99630: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11041 1726773079.99633: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11041 1726773079.99635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11041 1726773079.99638: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11041 1726773079.99641: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11041 1726773079.99644: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11041 1726773079.99670: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11041 1726773079.99674: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11041 1726773079.99676: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11041 1726773079.99806: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11041 1726773079.99859: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11041 1726773079.99875: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11041 1726773079.99894: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11041 1726773079.99903: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11041 1726773080.00037: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11041 1726773080.00057: starting attempt loop 11041 1726773080.00061: running the handler 11041 1726773080.00245: _low_level_execute_command(): 
starting 11041 1726773080.00253: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11041 1726773080.02817: stdout chunk (state=2): >>>/root <<< 11041 1726773080.02968: stderr chunk (state=3): >>><<< 11041 1726773080.02974: stdout chunk (state=3): >>><<< 11041 1726773080.02996: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11041 1726773080.03012: _low_level_execute_command(): starting 11041 1726773080.03020: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448 `" && echo ansible-tmp-1726773080.0300364-11041-163341813711448="` echo /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448 `" ) && sleep 0' 11041 1726773080.05829: stdout chunk (state=2): >>>ansible-tmp-1726773080.0300364-11041-163341813711448=/root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448 <<< 11041 1726773080.05959: stderr chunk (state=3): >>><<< 11041 1726773080.05966: stdout chunk (state=3): >>><<< 11041 1726773080.05985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773080.0300364-11041-163341813711448=/root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448 , stderr= 11041 1726773080.06100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/systemd-ZIP_DEFLATED 11041 1726773080.06195: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448/AnsiballZ_systemd.py 11041 1726773080.06493: Sending initial data 11041 1726773080.06511: Sent initial data (155 bytes) 11041 1726773080.08956: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpl2mg_xav /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448/AnsiballZ_systemd.py <<< 11041 1726773080.10770: stderr chunk (state=3): >>><<< 11041 1726773080.10776: stdout chunk (state=3): >>><<< 11041 1726773080.10802: done transferring module to remote 11041 1726773080.10816: _low_level_execute_command(): starting 11041 1726773080.10821: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448/ /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448/AnsiballZ_systemd.py && sleep 0' 11041 1726773080.13453: stderr chunk (state=2): >>><<< 11041 1726773080.13465: stdout chunk (state=2): >>><<< 11041 1726773080.13486: _low_level_execute_command() done: rc=0, stdout=, stderr= 11041 1726773080.13490: _low_level_execute_command(): starting 11041 1726773080.13497: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448/AnsiballZ_systemd.py && sleep 0' 11041 1726773080.39389: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", 
"Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18825216", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11041 1726773080.39496: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot 
cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChange<<< 11041 1726773080.39507: stdout chunk (state=3): >>>TimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 11041 
1726773080.40860: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11041 1726773080.40912: stderr chunk (state=3): >>><<< 11041 1726773080.40918: stdout chunk (state=3): >>><<< 11041 1726773080.40939: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18825216", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": 
"no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11041 1726773080.41050: done with _execute_module (systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11041 1726773080.41066: _low_level_execute_command(): starting 11041 1726773080.41072: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773080.0300364-11041-163341813711448/ > /dev/null 2>&1 && sleep 0' 11041 1726773080.43810: stderr chunk (state=2): >>><<< 11041 1726773080.43826: stdout chunk (state=2): >>><<< 11041 1726773080.43846: _low_level_execute_command() done: rc=0, stdout=, stderr= 11041 1726773080.43854: handler run complete 11041 1726773080.43859: attempt loop complete, returning result 11041 1726773080.43936: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11041 1726773080.43943: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11041 1726773080.43945: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11041 1726773080.43948: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11041 1726773080.43950: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11041 1726773080.43952: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11041 1726773080.43954: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11041 1726773080.43956: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11041 1726773080.43958: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11041 1726773080.43994: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11041 1726773080.43998: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11041 1726773080.44000: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11041 1726773080.44153: dumping result to json 11041 1726773080.44260: done dumping result, returning 11041 1726773080.44276: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [12a3200b-1e9d-1dbd-cc52-0000000005ed] 11041 1726773080.44291: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ed 11041 1726773080.44297: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ed 11041 1726773080.44299: WORKER PROCESS EXITING ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", 
"Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "658", "MemoryAccounting": "yes", "MemoryCurrent": "18825216", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "WatchdogUSec": "0" } } 8119 1726773080.44817: no more pending results, returning what we have 8119 1726773080.44823: results queue empty 8119 1726773080.44825: checking for any_errors_fatal 8119 1726773080.44828: done checking for any_errors_fatal 8119 1726773080.44829: checking for max_fail_percentage 8119 1726773080.44832: done checking for max_fail_percentage 8119 1726773080.44833: checking to see if all hosts have failed and the running result is not ok 8119 1726773080.44834: done checking to see if all hosts have failed 8119 1726773080.44836: getting the remaining hosts for this loop 8119 1726773080.44837: done getting the remaining hosts for this loop 8119 1726773080.44843: building list of next tasks for hosts 8119 1726773080.44844: getting the next task for host managed_node2 8119 1726773080.44850: done getting next task for host managed_node2 8119 1726773080.44852: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773080.44855: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773080.44857: done building task lists 8119 1726773080.44858: counting tasks in each state of execution 8119 1726773080.44861: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773080.44862: advancing hosts in ITERATING_TASKS 8119 1726773080.44863: starting to advance hosts 8119 1726773080.44865: getting the next task for host managed_node2 8119 1726773080.44868: done getting next task for host managed_node2 8119 1726773080.44870: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773080.44872: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773080.44873: done advancing hosts to next task 8119 1726773080.44887: getting variables 8119 1726773080.44890: in VariableManager get_vars() 8119 1726773080.44920: Calling all_inventory to load vars for managed_node2 8119 1726773080.44925: Calling groups_inventory to load vars for managed_node2 8119 1726773080.44927: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773080.44949: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.44959: Calling all_plugins_play to load vars for managed_node2 8119 1726773080.44969: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.44978: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773080.44992: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.44999: Calling groups_plugins_play to load vars for managed_node2 8119 1726773080.45011: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.45035: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.45050: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.45257: done with get_vars() 8119 1726773080.45268: done getting variables 8119 1726773080.45273: sending task start callback, copying the task so we can template it temporarily 8119 1726773080.45274: done copying, going to template now 8119 1726773080.45279: done templating 8119 1726773080.45286: here goes the callback... 
TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.490) 0:01:15.009 **** 8119 1726773080.45314: sending task start callback 8119 1726773080.45317: entering _queue_task() for managed_node2/file 8119 1726773080.45452: worker is 1 (out of 1 available) 8119 1726773080.45492: exiting _queue_task() for managed_node2/file 8119 1726773080.45565: done queuing things up, now waiting for results queue to drain 8119 1726773080.45570: waiting for pending results... 11059 1726773080.45641: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 11059 1726773080.45694: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005ee 11059 1726773080.45745: calling self._execute() 11059 1726773080.47930: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11059 1726773080.48020: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11059 1726773080.48080: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11059 1726773080.48122: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11059 1726773080.48152: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11059 1726773080.48184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11059 1726773080.48232: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11059 1726773080.48321: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11059 1726773080.48343: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11059 1726773080.48428: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11059 1726773080.48445: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11059 1726773080.48462: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11059 1726773080.48700: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11059 1726773080.48735: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11059 1726773080.48748: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11059 1726773080.48759: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11059 1726773080.48764: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11059 1726773080.48850: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11059 1726773080.48868: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11059 1726773080.48894: Loading 
ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11059 1726773080.48911: starting attempt loop 11059 1726773080.48914: running the handler 11059 1726773080.48923: _low_level_execute_command(): starting 11059 1726773080.48927: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11059 1726773080.51607: stdout chunk (state=2): >>>/root <<< 11059 1726773080.51869: stderr chunk (state=3): >>><<< 11059 1726773080.51876: stdout chunk (state=3): >>><<< 11059 1726773080.51905: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11059 1726773080.51923: _low_level_execute_command(): starting 11059 1726773080.51930: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599 `" && echo ansible-tmp-1726773080.5191607-11059-244872418963599="` echo /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599 `" ) && sleep 0' 11059 1726773080.55498: stdout chunk (state=2): >>>ansible-tmp-1726773080.5191607-11059-244872418963599=/root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599 <<< 11059 1726773080.55645: stderr chunk (state=3): >>><<< 11059 1726773080.55654: stdout chunk (state=3): >>><<< 11059 1726773080.55680: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773080.5191607-11059-244872418963599=/root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599 , stderr= 11059 1726773080.55800: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11059 1726773080.55879: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599/AnsiballZ_file.py 11059 1726773080.56628: Sending initial data 11059 1726773080.56642: Sent initial data (152 bytes) 11059 1726773080.59434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp3kkwr1kr /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599/AnsiballZ_file.py <<< 11059 1726773080.60600: stderr chunk (state=3): >>><<< 11059 1726773080.60607: stdout chunk (state=3): >>><<< 11059 1726773080.60633: done transferring module to remote 11059 1726773080.60647: _low_level_execute_command(): starting 11059 1726773080.60651: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599/ /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599/AnsiballZ_file.py && sleep 0' 11059 1726773080.63277: stderr chunk (state=2): >>><<< 11059 1726773080.63291: stdout chunk (state=2): >>><<< 11059 1726773080.63313: _low_level_execute_command() done: rc=0, stdout=, stderr= 11059 1726773080.63318: _low_level_execute_command(): starting 11059 1726773080.63325: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599/AnsiballZ_file.py && sleep 0' 11059 1726773080.79188: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": 
"directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11059 1726773080.80237: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11059 1726773080.80286: stderr chunk (state=3): >>><<< 11059 1726773080.80291: stdout chunk (state=3): >>><<< 11059 1726773080.80315: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
11059 1726773080.80352: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11059 1726773080.80365: _low_level_execute_command(): starting 11059 1726773080.80370: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773080.5191607-11059-244872418963599/ > /dev/null 2>&1 && sleep 0' 11059 1726773080.83050: stderr chunk (state=2): >>><<< 11059 1726773080.83061: stdout chunk (state=2): >>><<< 11059 1726773080.83078: _low_level_execute_command() done: rc=0, stdout=, stderr= 11059 1726773080.83085: handler run complete 11059 1726773080.83091: attempt loop complete, returning result 11059 1726773080.83104: _execute() done 11059 1726773080.83107: dumping result to json 11059 1726773080.83113: done dumping result, returning 11059 1726773080.83128: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [12a3200b-1e9d-1dbd-cc52-0000000005ee] 11059 1726773080.83145: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ee 11059 1726773080.83182: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ee 11059 1726773080.83230: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8119 1726773080.83371: no more pending results, returning what we have 8119 1726773080.83376: results queue empty 8119 1726773080.83377: checking for any_errors_fatal 8119 1726773080.83391: done checking for any_errors_fatal 8119 1726773080.83394: checking for max_fail_percentage 8119 1726773080.83398: done checking for max_fail_percentage 8119 1726773080.83400: checking to see if all hosts have failed and the running result is not ok 8119 1726773080.83402: done checking to see if all hosts have failed 8119 1726773080.83404: getting the remaining hosts for this loop 8119 1726773080.83407: done getting the remaining hosts for this loop 8119 1726773080.83415: building list of next tasks for hosts 8119 1726773080.83419: getting the next task for host managed_node2 8119 1726773080.83427: done getting next task for host managed_node2 8119 1726773080.83431: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773080.83435: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8119 1726773080.83437: done building task lists 8119 1726773080.83439: counting tasks in each state of execution 8119 1726773080.83442: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773080.83444: advancing hosts in ITERATING_TASKS 8119 1726773080.83447: starting to advance hosts 8119 1726773080.83449: getting the next task for host managed_node2 8119 1726773080.83452: done getting next task for host managed_node2 8119 1726773080.83455: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773080.83458: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773080.83460: done advancing hosts to next task 8119 1726773080.83475: getting variables 8119 1726773080.83478: in VariableManager get_vars() 8119 1726773080.83518: Calling all_inventory to load vars for managed_node2 8119 1726773080.83525: Calling groups_inventory to load vars for managed_node2 8119 1726773080.83529: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773080.83557: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.83573: Calling all_plugins_play to load vars for managed_node2 8119 1726773080.83591: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.83605: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773080.83622: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.83630: Calling groups_plugins_play to load vars for managed_node2 8119 1726773080.83640: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.83658: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.83672: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773080.83885: done with get_vars() 8119 1726773080.83896: done getting variables 8119 1726773080.83900: sending task start callback, copying the task so we can template it temporarily 8119 1726773080.83902: done copying, going to template now 8119 1726773080.83903: done templating 8119 1726773080.83905: here goes the callback... 
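The cycle above covers the role's "Ensure kernel settings profile directory exists" task (tasks/main.yml:74): it resolves to a single file-module call with path=/etc/tuned/kernel_settings, state=directory and mode=0755, and returns changed=false because the directory is already present with those attributes. A minimal standalone task that would produce the same module invocation is sketched below; the literal path is copied from the module args echoed in the log, whereas the role presumably derives it from an internal variable, so treat this as an illustration rather than the role's verbatim source.

- name: Ensure kernel settings profile directory exists
  file:
    path: /etc/tuned/kernel_settings   # literal value taken from the module args above; the role likely uses a variable here
    state: directory
    mode: "0755"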
TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.386) 0:01:15.395 **** 8119 1726773080.83922: sending task start callback 8119 1726773080.83924: entering _queue_task() for managed_node2/slurp 8119 1726773080.84045: worker is 1 (out of 1 available) 8119 1726773080.84085: exiting _queue_task() for managed_node2/slurp 8119 1726773080.84157: done queuing things up, now waiting for results queue to drain 8119 1726773080.84162: waiting for pending results... 11081 1726773080.84234: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 11081 1726773080.84286: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005ef 11081 1726773080.84335: calling self._execute() 11081 1726773080.86116: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11081 1726773080.86203: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11081 1726773080.86264: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11081 1726773080.86301: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11081 1726773080.86331: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11081 1726773080.86358: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11081 1726773080.86405: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11081 1726773080.86430: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11081 1726773080.86446: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11081 1726773080.86540: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11081 1726773080.86557: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11081 1726773080.86570: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11081 1726773080.86786: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11081 1726773080.86819: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11081 1726773080.86830: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11081 1726773080.86840: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11081 1726773080.86847: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11081 1726773080.86928: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11081 1726773080.86941: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11081 1726773080.86966: Loading ActionModule 'normal' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11081 1726773080.86985: starting attempt loop 11081 1726773080.86988: running the handler 11081 1726773080.86996: _low_level_execute_command(): starting 11081 1726773080.87000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11081 1726773080.89432: stdout chunk (state=2): >>>/root <<< 11081 1726773080.89556: stderr chunk (state=3): >>><<< 11081 1726773080.89561: stdout chunk (state=3): >>><<< 11081 1726773080.89579: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11081 1726773080.89597: _low_level_execute_command(): starting 11081 1726773080.89603: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417 `" && echo ansible-tmp-1726773080.895905-11081-267784801451417="` echo /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417 `" ) && sleep 0' 11081 1726773080.92351: stdout chunk (state=2): >>>ansible-tmp-1726773080.895905-11081-267784801451417=/root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417 <<< 11081 1726773080.92471: stderr chunk (state=3): >>><<< 11081 1726773080.92476: stdout chunk (state=3): >>><<< 11081 1726773080.92497: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773080.895905-11081-267784801451417=/root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417 , stderr= 11081 1726773080.92573: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/slurp-ZIP_DEFLATED 11081 1726773080.92632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417/AnsiballZ_slurp.py 11081 1726773080.92928: Sending initial data 11081 1726773080.92944: Sent initial data (152 bytes) 11081 1726773080.95406: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpil3aji1a /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417/AnsiballZ_slurp.py <<< 11081 1726773080.96414: stderr chunk (state=3): >>><<< 11081 1726773080.96421: stdout chunk (state=3): >>><<< 11081 1726773080.96448: done transferring module to remote 11081 1726773080.96463: _low_level_execute_command(): starting 11081 1726773080.96467: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417/ /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417/AnsiballZ_slurp.py && sleep 0' 11081 1726773080.99054: stderr chunk (state=2): >>><<< 11081 1726773080.99068: stdout chunk (state=2): >>><<< 11081 1726773080.99095: _low_level_execute_command() done: rc=0, stdout=, stderr= 11081 1726773080.99101: _low_level_execute_command(): starting 11081 1726773080.99110: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417/AnsiballZ_slurp.py && sleep 0' 11081 1726773081.14132: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11081 1726773081.15251: stderr chunk (state=3): >>>Shared 
connection to 10.31.8.150 closed. <<< 11081 1726773081.15263: stderr chunk (state=3): >>><<< 11081 1726773081.15266: stdout chunk (state=3): >>><<< 11081 1726773081.15284: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.150 closed. 11081 1726773081.15317: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11081 1726773081.15335: _low_level_execute_command(): starting 11081 1726773081.15342: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773080.895905-11081-267784801451417/ > /dev/null 2>&1 && sleep 0' 11081 1726773081.18253: stderr chunk (state=2): >>><<< 11081 1726773081.18263: stdout chunk (state=2): >>><<< 11081 1726773081.18280: _low_level_execute_command() done: rc=0, stdout=, stderr= 11081 1726773081.18288: handler run complete 11081 1726773081.18316: attempt loop complete, returning result 11081 1726773081.18330: _execute() done 11081 1726773081.18332: dumping result to json 11081 1726773081.18336: done dumping result, returning 11081 1726773081.18354: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [12a3200b-1e9d-1dbd-cc52-0000000005ef] 11081 1726773081.18368: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ef 11081 1726773081.18404: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005ef 11081 1726773081.18465: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8119 1726773081.18570: no more pending results, returning what we have 8119 1726773081.18575: results queue empty 8119 1726773081.18577: checking for any_errors_fatal 8119 1726773081.18585: done checking for any_errors_fatal 8119 1726773081.18587: checking for max_fail_percentage 8119 1726773081.18590: done checking for max_fail_percentage 8119 1726773081.18592: checking to see if all hosts have failed and the running result is not ok 8119 1726773081.18594: done checking to see if all hosts have failed 8119 1726773081.18596: getting the remaining hosts for this loop 8119 1726773081.18599: done getting the remaining hosts for this loop 8119 1726773081.18607: building list of next tasks for hosts 8119 1726773081.18610: getting the next task for host managed_node2 8119 1726773081.18617: done getting next task for host managed_node2 8119 1726773081.18621: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773081.18625: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773081.18628: done building task lists 8119 1726773081.18630: counting tasks in each state of execution 8119 1726773081.18634: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773081.18636: advancing hosts in ITERATING_TASKS 8119 1726773081.18638: starting to advance hosts 8119 1726773081.18640: getting the next task for host managed_node2 8119 1726773081.18644: done getting next task for host managed_node2 8119 1726773081.18647: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773081.18650: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773081.18652: done advancing hosts to next task 8119 1726773081.18667: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773081.18671: getting variables 8119 1726773081.18674: in VariableManager get_vars() 8119 1726773081.18708: Calling all_inventory to load vars for managed_node2 8119 1726773081.18714: Calling groups_inventory to load vars for managed_node2 8119 1726773081.18718: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773081.18740: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.18750: Calling all_plugins_play to load vars for managed_node2 8119 1726773081.18761: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.18769: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773081.18780: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.18795: Calling groups_plugins_play to load vars for managed_node2 8119 1726773081.18818: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.18849: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.18864: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 
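The completed "Get active_profile" task (tasks/main.yml:80) is a slurp of /etc/tuned/active_profile; its base64 payload dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK decodes to "virtual-guest kernel_settings\n", i.e. the kernel_settings child profile is already listed after the host's virtual-guest profile. A hedged reconstruction of that task follows; the register name is illustrative and not taken from the log.

- name: Get active_profile
  slurp:
    path: /etc/tuned/active_profile   # source path shown in the module result above
  register: __current_profile         # illustrative name, not necessarily what the role uses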
8119 1726773081.19129: done with get_vars() 8119 1726773081.19143: done getting variables 8119 1726773081.19150: sending task start callback, copying the task so we can template it temporarily 8119 1726773081.19153: done copying, going to template now 8119 1726773081.19155: done templating 8119 1726773081.19157: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.352) 0:01:15.748 **** 8119 1726773081.19177: sending task start callback 8119 1726773081.19179: entering _queue_task() for managed_node2/set_fact 8119 1726773081.19322: worker is 1 (out of 1 available) 8119 1726773081.19357: exiting _queue_task() for managed_node2/set_fact 8119 1726773081.19427: done queuing things up, now waiting for results queue to drain 8119 1726773081.19433: waiting for pending results... 11099 1726773081.19661: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 11099 1726773081.19728: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f0 11099 1726773081.19780: calling self._execute() 11099 1726773081.21613: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11099 1726773081.21696: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11099 1726773081.21759: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11099 1726773081.21791: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11099 1726773081.21820: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11099 1726773081.21847: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11099 1726773081.21895: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11099 1726773081.21920: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11099 1726773081.21937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11099 1726773081.22020: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11099 1726773081.22038: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11099 1726773081.22052: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11099 1726773081.22372: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11099 1726773081.22406: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11099 1726773081.22417: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11099 1726773081.22429: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11099 1726773081.22436: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11099 1726773081.22530: 
Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11099 1726773081.22547: starting attempt loop 11099 1726773081.22551: running the handler 11099 1726773081.22563: handler run complete 11099 1726773081.22566: attempt loop complete, returning result 11099 1726773081.22569: _execute() done 11099 1726773081.22571: dumping result to json 11099 1726773081.22574: done dumping result, returning 11099 1726773081.22579: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [12a3200b-1e9d-1dbd-cc52-0000000005f0] 11099 1726773081.22587: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f0 11099 1726773081.22623: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f0 11099 1726773081.22627: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8119 1726773081.22762: no more pending results, returning what we have 8119 1726773081.22768: results queue empty 8119 1726773081.22770: checking for any_errors_fatal 8119 1726773081.22774: done checking for any_errors_fatal 8119 1726773081.22776: checking for max_fail_percentage 8119 1726773081.22778: done checking for max_fail_percentage 8119 1726773081.22780: checking to see if all hosts have failed and the running result is not ok 8119 1726773081.22781: done checking to see if all hosts have failed 8119 1726773081.22784: getting the remaining hosts for this loop 8119 1726773081.22787: done getting the remaining hosts for this loop 8119 1726773081.22796: building list of next tasks for hosts 8119 1726773081.22799: getting the next task for host managed_node2 8119 1726773081.22810: done getting next task for host managed_node2 8119 1726773081.22815: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773081.22819: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773081.22821: done building task lists 8119 1726773081.22823: counting tasks in each state of execution 8119 1726773081.22828: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773081.22830: advancing hosts in ITERATING_TASKS 8119 1726773081.22832: starting to advance hosts 8119 1726773081.22834: getting the next task for host managed_node2 8119 1726773081.22837: done getting next task for host managed_node2 8119 1726773081.22840: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773081.22843: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773081.22844: done advancing hosts to next task 8119 1726773081.22858: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773081.22862: getting variables 8119 1726773081.22864: in VariableManager get_vars() 8119 1726773081.22900: Calling all_inventory to load vars for managed_node2 8119 1726773081.22906: Calling groups_inventory to load vars for managed_node2 8119 1726773081.22912: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773081.22938: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.22953: Calling all_plugins_play to load vars for managed_node2 8119 1726773081.22968: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.22982: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773081.23000: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.23013: Calling groups_plugins_play to load vars for managed_node2 8119 1726773081.23028: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.23048: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.23062: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.23291: done with get_vars() 8119 1726773081.23301: done getting variables 8119 1726773081.23306: sending task start callback, copying the task so we can template it temporarily 8119 1726773081.23310: done copying, going to template now 8119 1726773081.23312: done templating 8119 1726773081.23313: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.041) 0:01:15.789 **** 8119 1726773081.23328: sending task start callback 8119 1726773081.23330: entering _queue_task() for managed_node2/copy 8119 1726773081.23450: worker is 1 (out of 1 available) 8119 1726773081.23487: exiting _queue_task() for managed_node2/copy 8119 1726773081.23562: done queuing things up, now waiting for results queue to drain 8119 1726773081.23568: waiting for pending results... 
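The "Set active_profile" task (tasks/main.yml:85) runs entirely on the controller, which is why no _low_level_execute_command() calls appear for it: set_fact simply publishes __kernel_settings_active_profile as "virtual-guest kernel_settings". Reusing the illustrative register name from the sketch above, an equivalent assignment would look like the following; the exact Jinja2 expression the role uses is not visible in the log, so this is a plausible reconstruction only.

- name: Set active_profile
  set_fact:
    # decode the slurped payload and drop the trailing newline (reconstruction, not the role's actual template)
    __kernel_settings_active_profile: "{{ __current_profile.content | b64decode | trim }}"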
11101 1726773081.23628: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 11101 1726773081.23677: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f1 11101 1726773081.23726: calling self._execute() 11101 1726773081.25445: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11101 1726773081.25528: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11101 1726773081.25581: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11101 1726773081.25611: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11101 1726773081.25636: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11101 1726773081.25666: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11101 1726773081.25713: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11101 1726773081.25737: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11101 1726773081.25753: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11101 1726773081.25834: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11101 1726773081.25850: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11101 1726773081.25864: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11101 1726773081.26150: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11101 1726773081.26187: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11101 1726773081.26198: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11101 1726773081.26208: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11101 1726773081.26215: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11101 1726773081.26305: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11101 1726773081.26322: starting attempt loop 11101 1726773081.26324: running the handler 11101 1726773081.26334: _low_level_execute_command(): starting 11101 1726773081.26338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11101 1726773081.28827: stdout chunk (state=2): >>>/root <<< 11101 1726773081.28993: stderr chunk (state=3): >>><<< 11101 1726773081.28998: stdout chunk (state=3): >>><<< 11101 1726773081.29022: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11101 1726773081.29037: _low_level_execute_command(): starting 11101 1726773081.29042: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557 `" && echo ansible-tmp-1726773081.2903156-11101-160588987864557="` echo /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557 `" ) && sleep 0' 11101 1726773081.31835: stdout chunk (state=2): >>>ansible-tmp-1726773081.2903156-11101-160588987864557=/root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557 <<< 11101 1726773081.32042: stderr chunk (state=3): >>><<< 11101 1726773081.32048: stdout chunk (state=3): >>><<< 11101 1726773081.32066: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773081.2903156-11101-160588987864557=/root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557 , stderr= 11101 1726773081.32207: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11101 1726773081.32263: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_stat.py 11101 1726773081.32539: Sending initial data 11101 1726773081.32555: Sent initial data (152 bytes) 11101 1726773081.35004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpnjpf35rf /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_stat.py <<< 11101 1726773081.36056: stderr chunk (state=3): >>><<< 11101 1726773081.36062: stdout chunk (state=3): >>><<< 11101 1726773081.36088: done transferring module to remote 11101 1726773081.36106: _low_level_execute_command(): starting 11101 1726773081.36112: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/ /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_stat.py && sleep 0' 11101 1726773081.38794: stderr chunk (state=2): >>><<< 11101 1726773081.38807: stdout chunk (state=2): >>><<< 11101 1726773081.38826: _low_level_execute_command() done: rc=0, stdout=, stderr= 11101 1726773081.38830: _low_level_execute_command(): starting 11101 1726773081.38838: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_stat.py && sleep 0' 11101 1726773081.54455: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773081.1390235, "mtime": 1726773073.316735, "ctime": 1726773073.316735, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11101 1726773081.55557: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11101 1726773081.55566: stdout chunk (state=3): >>><<< 11101 1726773081.55577: stderr chunk (state=3): >>><<< 11101 1726773081.55601: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773081.1390235, "mtime": 1726773073.316735, "ctime": 1726773073.316735, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 11101 1726773081.55677: done with _execute_module (stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11101 1726773081.55792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11101 1726773081.55847: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_file.py 11101 1726773081.56397: Sending initial data 11101 1726773081.56420: Sent initial data (152 bytes) 11101 1726773081.58826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpt95lup0m /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_file.py <<< 11101 1726773081.59857: stderr chunk (state=3): >>><<< 11101 1726773081.59862: stdout chunk (state=3): >>><<< 11101 1726773081.59886: done transferring module to remote 11101 1726773081.59900: _low_level_execute_command(): starting 11101 1726773081.59905: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/ /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_file.py && sleep 0' 11101 1726773081.62453: stderr chunk (state=2): >>><<< 11101 1726773081.62468: stdout chunk (state=2): >>><<< 11101 1726773081.62491: _low_level_execute_command() done: rc=0, stdout=, stderr= 11101 1726773081.62496: _low_level_execute_command(): starting 11101 1726773081.62503: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/AnsiballZ_file.py && sleep 0' 11101 1726773081.78405: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp4tne2ycg", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11101 1726773081.79261: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11101 1726773081.79271: stdout chunk (state=3): >>><<< 11101 1726773081.79287: stderr chunk (state=3): >>><<< 11101 1726773081.79309: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp4tne2ycg", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
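The surrounding cycle implements "Ensure kernel_settings is in active_profile" (tasks/main.yml:91) as a copy action: the log shows it first running the stat module against /etc/tuned/active_profile, finding that the existing checksum already matches the desired content, and then falling through to the file module only to enforce mode 0600, so the task ends up changed=false. A sketch of a copy task consistent with that sequence is below; the content expression is an assumption based on the fact set earlier, not the role's actual template.

- name: Ensure kernel_settings is in active_profile
  copy:
    content: "{{ __kernel_settings_active_profile }}\n"   # assumed: decoded profile list plus trailing newline (matches the 30-byte file seen above)
    dest: /etc/tuned/active_profile
    mode: "0600"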
11101 1726773081.79358: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmp4tne2ycg', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11101 1726773081.79376: _low_level_execute_command(): starting 11101 1726773081.79386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773081.2903156-11101-160588987864557/ > /dev/null 2>&1 && sleep 0' 11101 1726773081.82287: stderr chunk (state=2): >>><<< 11101 1726773081.82300: stdout chunk (state=2): >>><<< 11101 1726773081.82325: _low_level_execute_command() done: rc=0, stdout=, stderr= 11101 1726773081.82338: handler run complete 11101 1726773081.82388: attempt loop complete, returning result 11101 1726773081.82408: _execute() done 11101 1726773081.82412: dumping result to json 11101 1726773081.82419: done dumping result, returning 11101 1726773081.82435: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [12a3200b-1e9d-1dbd-cc52-0000000005f1] 11101 1726773081.82454: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f1 11101 1726773081.82552: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f1 11101 1726773081.82557: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8119 1726773081.82953: no more pending results, returning what we have 8119 1726773081.82960: results queue empty 8119 1726773081.82966: checking for any_errors_fatal 8119 1726773081.82970: done checking for any_errors_fatal 8119 1726773081.82972: checking for max_fail_percentage 8119 1726773081.82974: done checking for max_fail_percentage 8119 1726773081.82975: checking to see if all hosts have failed and the running result is not ok 8119 1726773081.82977: done checking to see if all hosts have failed 8119 1726773081.82978: getting the remaining hosts for this loop 8119 1726773081.82980: done getting the remaining hosts for this loop 8119 1726773081.82987: building list of next tasks for hosts 8119 1726773081.82989: getting the next task for host managed_node2 8119 1726773081.82995: done getting next task for host managed_node2 8119 1726773081.82998: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773081.83001: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773081.83002: done building task lists 8119 1726773081.83003: counting tasks in each state of execution 8119 1726773081.83006: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773081.83008: advancing hosts in ITERATING_TASKS 8119 1726773081.83010: starting to advance hosts 8119 1726773081.83011: getting the next task for host managed_node2 8119 1726773081.83014: done getting next task for host managed_node2 8119 1726773081.83015: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773081.83017: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773081.83019: done advancing hosts to next task 8119 1726773081.83030: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773081.83033: getting variables 8119 1726773081.83035: in VariableManager get_vars() 8119 1726773081.83063: Calling all_inventory to load vars for managed_node2 8119 1726773081.83067: Calling groups_inventory to load vars for managed_node2 8119 1726773081.83070: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773081.83094: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.83106: Calling all_plugins_play to load vars for managed_node2 8119 1726773081.83121: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.83132: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773081.83143: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.83149: Calling groups_plugins_play to load vars for managed_node2 8119 1726773081.83159: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.83181: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.83200: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773081.83419: done with get_vars() 8119 1726773081.83430: done getting variables 8119 1726773081.83434: sending task start callback, copying the task so we can template it 
temporarily 8119 1726773081.83436: done copying, going to template now 8119 1726773081.83438: done templating 8119 1726773081.83439: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.601) 0:01:16.390 **** 8119 1726773081.83454: sending task start callback 8119 1726773081.83456: entering _queue_task() for managed_node2/copy 8119 1726773081.83571: worker is 1 (out of 1 available) 8119 1726773081.83611: exiting _queue_task() for managed_node2/copy 8119 1726773081.83682: done queuing things up, now waiting for results queue to drain 8119 1726773081.83689: waiting for pending results... 11132 1726773081.83756: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 11132 1726773081.83808: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f2 11132 1726773081.83856: calling self._execute() 11132 1726773081.85743: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11132 1726773081.85849: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11132 1726773081.85908: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11132 1726773081.85946: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11132 1726773081.85986: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11132 1726773081.86027: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11132 1726773081.86092: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11132 1726773081.86127: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11132 1726773081.86153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11132 1726773081.86263: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11132 1726773081.86294: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11132 1726773081.86321: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11132 1726773081.86611: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11132 1726773081.86659: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11132 1726773081.86672: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11132 1726773081.86685: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11132 1726773081.86691: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11132 1726773081.86798: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11132 1726773081.86816: starting attempt loop 11132 1726773081.86818: running the handler 11132 1726773081.86826: _low_level_execute_command(): starting 11132 1726773081.86830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11132 1726773081.89275: stdout chunk (state=2): >>>/root <<< 11132 1726773081.89406: stderr chunk (state=3): >>><<< 11132 1726773081.89414: stdout chunk (state=3): >>><<< 11132 1726773081.89438: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11132 1726773081.89454: _low_level_execute_command(): starting 11132 1726773081.89461: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803 `" && echo ansible-tmp-1726773081.894479-11132-8370340408803="` echo /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803 `" ) && sleep 0' 11132 1726773081.92509: stdout chunk (state=2): >>>ansible-tmp-1726773081.894479-11132-8370340408803=/root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803 <<< 11132 1726773081.92890: stderr chunk (state=3): >>><<< 11132 1726773081.92898: stdout chunk (state=3): >>><<< 11132 1726773081.92926: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773081.894479-11132-8370340408803=/root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803 , stderr= 11132 1726773081.93120: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11132 1726773081.93187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_stat.py 11132 1726773081.93899: Sending initial data 11132 1726773081.93913: Sent initial data (149 bytes) 11132 1726773081.96300: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpbpbdung3 /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_stat.py <<< 11132 1726773081.97488: stderr chunk (state=3): >>><<< 11132 1726773081.97497: stdout chunk (state=3): >>><<< 11132 1726773081.97520: done transferring module to remote 11132 1726773081.97535: _low_level_execute_command(): starting 11132 1726773081.97539: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/ /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_stat.py && sleep 0' 11132 1726773082.00050: stderr chunk (state=2): >>><<< 11132 1726773082.00060: stdout chunk (state=2): >>><<< 11132 1726773082.00079: _low_level_execute_command() done: rc=0, stdout=, stderr= 11132 1726773082.00085: _low_level_execute_command(): starting 11132 1726773082.00094: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_stat.py && sleep 0' 11132 1726773082.15418: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773071.2800772, "mtime": 1726773073.316735, "ctime": 
1726773073.316735, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11132 1726773082.16471: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11132 1726773082.16528: stderr chunk (state=3): >>><<< 11132 1726773082.16537: stdout chunk (state=3): >>><<< 11132 1726773082.16561: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773071.2800772, "mtime": 1726773073.316735, "ctime": 1726773073.316735, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 
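The stat result above is what the copy action plugin uses to decide whether /etc/tuned/profile_mode needs to be rewritten. A minimal Python sketch of that decision follows (an illustration, not the plugin's actual code): the SHA-1 of the content Ansible intends to write is compared with the checksum the remote stat reported, and because they match in this run, only the file module is invoked afterwards to re-assert mode and ownership.

import hashlib

def needs_copy(desired_content: bytes, remote_checksum: str) -> bool:
    """True when the destination's SHA-1 differs from the content to be written."""
    return hashlib.sha1(desired_content).hexdigest() != remote_checksum

# In this run both checksums were 3ef9f23deed2e23d3ef2b88b842fb882313e15ce,
# so needs_copy(...) would be False: no file transfer happens, and the follow-up
# file module call in the next log entry only enforces mode 0600 and root ownership.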
11132 1726773082.16643: done with _execute_module (stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11132 1726773082.16760: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11132 1726773082.16815: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_file.py 11132 1726773082.17514: Sending initial data 11132 1726773082.17528: Sent initial data (149 bytes) 11132 1726773082.19922: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpgj99bw7o /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_file.py <<< 11132 1726773082.21112: stderr chunk (state=3): >>><<< 11132 1726773082.21120: stdout chunk (state=3): >>><<< 11132 1726773082.21146: done transferring module to remote 11132 1726773082.21161: _low_level_execute_command(): starting 11132 1726773082.21167: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/ /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_file.py && sleep 0' 11132 1726773082.23856: stderr chunk (state=2): >>><<< 11132 1726773082.23867: stdout chunk (state=2): >>><<< 11132 1726773082.23886: _low_level_execute_command() done: rc=0, stdout=, stderr= 11132 1726773082.23890: _low_level_execute_command(): starting 11132 1726773082.23896: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/AnsiballZ_file.py && sleep 0' 11132 1726773082.39711: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpna7hz0to", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11132 1726773082.41100: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11132 1726773082.41115: stdout chunk (state=3): >>><<< 11132 1726773082.41127: stderr chunk (state=3): >>><<< 11132 1726773082.41147: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpna7hz0to", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11132 1726773082.41194: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpna7hz0to', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11132 1726773082.41219: _low_level_execute_command(): starting 11132 1726773082.41228: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773081.894479-11132-8370340408803/ > /dev/null 2>&1 && sleep 0' 11132 1726773082.44177: stderr chunk (state=2): >>><<< 11132 1726773082.44193: stdout chunk (state=2): >>><<< 11132 1726773082.44220: _low_level_execute_command() done: rc=0, stdout=, stderr= 11132 1726773082.44235: handler run complete 11132 1726773082.44281: attempt loop complete, returning result 11132 1726773082.44300: _execute() done 11132 1726773082.44303: dumping result to json 11132 1726773082.44309: done dumping result, returning 11132 1726773082.44326: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [12a3200b-1e9d-1dbd-cc52-0000000005f2] 11132 1726773082.44342: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f2 11132 1726773082.44399: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f2 11132 1726773082.44405: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8119 1726773082.44776: no more pending results, returning what we have 8119 1726773082.44784: results queue empty 8119 1726773082.44788: checking for any_errors_fatal 8119 
1726773082.44794: done checking for any_errors_fatal 8119 1726773082.44796: checking for max_fail_percentage 8119 1726773082.44800: done checking for max_fail_percentage 8119 1726773082.44802: checking to see if all hosts have failed and the running result is not ok 8119 1726773082.44804: done checking to see if all hosts have failed 8119 1726773082.44806: getting the remaining hosts for this loop 8119 1726773082.44809: done getting the remaining hosts for this loop 8119 1726773082.44817: building list of next tasks for hosts 8119 1726773082.44820: getting the next task for host managed_node2 8119 1726773082.44828: done getting next task for host managed_node2 8119 1726773082.44833: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773082.44837: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773082.44840: done building task lists 8119 1726773082.44842: counting tasks in each state of execution 8119 1726773082.44846: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773082.44848: advancing hosts in ITERATING_TASKS 8119 1726773082.44850: starting to advance hosts 8119 1726773082.44852: getting the next task for host managed_node2 8119 1726773082.44856: done getting next task for host managed_node2 8119 1726773082.44859: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773082.44862: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773082.44865: done advancing hosts to next task 8119 1726773082.44912: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773082.44919: getting variables 8119 1726773082.44922: in VariableManager get_vars() 8119 1726773082.44958: Calling all_inventory to load vars for managed_node2 8119 1726773082.44965: Calling groups_inventory to load vars for managed_node2 8119 1726773082.44968: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773082.44999: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.45014: Calling all_plugins_play to load vars for managed_node2 8119 1726773082.45030: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.45044: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773082.45061: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.45070: Calling groups_plugins_play to load vars for managed_node2 8119 1726773082.45088: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.45120: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.45146: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.45417: done with get_vars() 8119 1726773082.45428: done getting variables 8119 1726773082.45435: sending task start callback, copying the task so we can template it temporarily 8119 1726773082.45437: done copying, going to template now 8119 1726773082.45439: done templating 8119 1726773082.45440: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.620) 0:01:17.010 **** 8119 1726773082.45456: sending task start callback 8119 1726773082.45458: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773082.45596: worker is 1 (out of 1 available) 8119 1726773082.45627: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773082.45690: done queuing things up, now waiting for results queue to drain 8119 1726773082.45696: waiting for pending results... 
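The "plugin lookup ... failed" message above is benign: the collection ships kernel_settings_get_config only as a module, not as an action plugin, so the task falls through to the generic 'normal' action plugin (loaded a few lines below), which simply pushes the module to the remote host via AnsiballZ. A toy sketch of that fallback, assuming a simplified loader rather than Ansible's real one:

# Names with a dedicated action plugin in this run (illustrative subset, not the real registry).
ACTION_PLUGINS = {"copy", "template", "normal"}

def resolve_action(module_name: str) -> str:
    # No matching action plugin -> fall back to 'normal', which just executes the module remotely.
    return module_name if module_name in ACTION_PLUGINS else "normal"

print(resolve_action("fedora.linux_system_roles.kernel_settings_get_config"))  # -> normal
print(resolve_action("template"))                                              # -> template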
11158 1726773082.45872: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 11158 1726773082.45932: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f3 11158 1726773082.45978: calling self._execute() 11158 1726773082.47847: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11158 1726773082.47974: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11158 1726773082.48045: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11158 1726773082.48073: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11158 1726773082.48111: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11158 1726773082.48149: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11158 1726773082.48206: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11158 1726773082.48240: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11158 1726773082.48266: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11158 1726773082.48382: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11158 1726773082.48411: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11158 1726773082.48433: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11158 1726773082.48822: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11158 1726773082.48872: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11158 1726773082.48886: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11158 1726773082.48897: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11158 1726773082.48903: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11158 1726773082.48990: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11158 1726773082.49004: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11158 1726773082.49030: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11158 1726773082.49045: starting attempt loop 11158 1726773082.49047: running the handler 11158 1726773082.49056: _low_level_execute_command(): starting 11158 1726773082.49060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11158 1726773082.51532: stdout chunk (state=2): >>>/root <<< 11158 1726773082.51758: stderr chunk (state=3): >>><<< 11158 1726773082.51765: 
stdout chunk (state=3): >>><<< 11158 1726773082.51792: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11158 1726773082.51812: _low_level_execute_command(): starting 11158 1726773082.51820: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760 `" && echo ansible-tmp-1726773082.5180502-11158-786069430760="` echo /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760 `" ) && sleep 0' 11158 1726773082.54829: stdout chunk (state=2): >>>ansible-tmp-1726773082.5180502-11158-786069430760=/root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760 <<< 11158 1726773082.54970: stderr chunk (state=3): >>><<< 11158 1726773082.54978: stdout chunk (state=3): >>><<< 11158 1726773082.55003: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773082.5180502-11158-786069430760=/root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760 , stderr= 11158 1726773082.55104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 11158 1726773082.55175: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760/AnsiballZ_kernel_settings_get_config.py 11158 1726773082.55555: Sending initial data 11158 1726773082.55570: Sent initial data (171 bytes) 11158 1726773082.58269: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpr5t2uc8e /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760/AnsiballZ_kernel_settings_get_config.py <<< 11158 1726773082.59401: stderr chunk (state=3): >>><<< 11158 1726773082.59407: stdout chunk (state=3): >>><<< 11158 1726773082.59435: done transferring module to remote 11158 1726773082.59451: _low_level_execute_command(): starting 11158 1726773082.59457: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760/ /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11158 1726773082.62145: stderr chunk (state=2): >>><<< 11158 1726773082.62159: stdout chunk (state=2): >>><<< 11158 1726773082.62177: _low_level_execute_command() done: rc=0, stdout=, stderr= 11158 1726773082.62181: _low_level_execute_command(): starting 11158 1726773082.62189: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11158 1726773082.77206: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 11158 1726773082.78193: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11158 1726773082.78234: stderr chunk (state=3): >>><<< 11158 1726773082.78239: stdout chunk (state=3): >>><<< 11158 1726773082.78258: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 11158 1726773082.78287: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11158 1726773082.78304: _low_level_execute_command(): starting 11158 1726773082.78311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773082.5180502-11158-786069430760/ > /dev/null 2>&1 && sleep 0' 11158 1726773082.80974: stderr chunk (state=2): >>><<< 11158 1726773082.80987: stdout chunk (state=2): >>><<< 11158 1726773082.81011: _low_level_execute_command() done: rc=0, stdout=, stderr= 11158 1726773082.81019: handler run complete 11158 1726773082.81045: attempt loop complete, returning result 11158 1726773082.81061: _execute() done 11158 1726773082.81064: dumping result to json 11158 1726773082.81067: done dumping result, returning 11158 1726773082.81080: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [12a3200b-1e9d-1dbd-cc52-0000000005f3] 11158 1726773082.81095: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f3 11158 1726773082.81136: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f3 11158 1726773082.81140: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530" }, "sysfs": { "/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8119 1726773082.81402: no more pending results, returning what we have 8119 1726773082.81408: results queue empty 8119 1726773082.81411: checking for any_errors_fatal 8119 1726773082.81415: done checking for any_errors_fatal 8119 1726773082.81416: checking for max_fail_percentage 8119 1726773082.81419: done checking for max_fail_percentage 8119 1726773082.81420: checking to see if all hosts have failed and the running result is not ok 8119 1726773082.81421: done checking to see if all hosts have failed 8119 1726773082.81423: getting the remaining hosts for this 
loop 8119 1726773082.81424: done getting the remaining hosts for this loop 8119 1726773082.81430: building list of next tasks for hosts 8119 1726773082.81432: getting the next task for host managed_node2 8119 1726773082.81437: done getting next task for host managed_node2 8119 1726773082.81440: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773082.81443: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773082.81445: done building task lists 8119 1726773082.81446: counting tasks in each state of execution 8119 1726773082.81448: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773082.81450: advancing hosts in ITERATING_TASKS 8119 1726773082.81451: starting to advance hosts 8119 1726773082.81453: getting the next task for host managed_node2 8119 1726773082.81456: done getting next task for host managed_node2 8119 1726773082.81458: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773082.81459: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773082.81461: done advancing hosts to next task 8119 1726773082.81472: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773082.81475: getting variables 8119 1726773082.81477: in VariableManager get_vars() 8119 1726773082.81510: Calling all_inventory to load vars for managed_node2 8119 1726773082.81516: Calling groups_inventory to load vars for managed_node2 8119 1726773082.81518: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773082.81540: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.81550: Calling all_plugins_play to load vars for managed_node2 8119 1726773082.81560: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.81569: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773082.81579: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.81587: Calling groups_plugins_play to load vars for managed_node2 8119 1726773082.81598: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.81623: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.81638: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773082.81866: done with get_vars() 8119 1726773082.81876: done getting variables 8119 1726773082.81881: sending task start callback, copying the task so we can template it temporarily 8119 1726773082.81885: done copying, going to template now 8119 1726773082.81888: done templating 8119 1726773082.81889: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.364) 0:01:17.375 **** 8119 1726773082.81905: sending task start callback 8119 1726773082.81906: entering _queue_task() for managed_node2/template 8119 1726773082.82025: worker is 1 (out of 1 available) 8119 1726773082.82064: exiting _queue_task() for managed_node2/template 8119 1726773082.82136: done queuing things up, now waiting for results queue to drain 8119 1726773082.82141: waiting for pending results... 
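The data returned by the Get current config task above is the parsed form of /etc/tuned/kernel_settings/tuned.conf. A rough sketch of what that file likely looks like and how the returned dictionary maps onto it; the exact on-disk formatting is an assumption, only the section names, keys, and values are taken from the log:

import configparser

# Approximate reconstruction of the tuned.conf the module reported (assumed layout).
TUNED_CONF = """
[main]
summary = kernel settings

[sysctl]
fs.epoll.max_user_watches = 785592
vm.max_map_count = 65530

[sysfs]
/sys/class/net/lo/mtu = 65000
/sys/fs/selinux/avc/cache_threshold = 256
/sys/kernel/debug/x86/ibrs_enabled = 0
/sys/kernel/debug/x86/pti_enabled = 0
/sys/kernel/debug/x86/retp_enabled = 0
"""

parser = configparser.ConfigParser(delimiters=("=",))
parser.optionxform = str  # keep keys like /sys/class/net/lo/mtu exactly as written
parser.read_string(TUNED_CONF)
data = {section: dict(parser[section]) for section in parser.sections()}
print(data["sysctl"]["vm.max_map_count"])  # -> '65530', matching the task result above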
11178 1726773082.82210: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 11178 1726773082.82259: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f4 11178 1726773082.82306: calling self._execute() 11178 1726773082.84369: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11178 1726773082.84472: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11178 1726773082.84531: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11178 1726773082.84557: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11178 1726773082.84586: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11178 1726773082.84620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11178 1726773082.84661: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11178 1726773082.84685: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11178 1726773082.84706: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11178 1726773082.84786: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11178 1726773082.84803: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11178 1726773082.84822: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11178 1726773082.85243: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11178 1726773082.85275: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11178 1726773082.85288: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11178 1726773082.85302: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11178 1726773082.85312: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11178 1726773082.85403: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11178 1726773082.85425: starting attempt loop 11178 1726773082.85428: running the handler 11178 1726773082.85435: _low_level_execute_command(): starting 11178 1726773082.85439: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11178 1726773082.87894: stdout chunk (state=2): >>>/root <<< 11178 1726773082.88010: stderr chunk (state=3): >>><<< 11178 1726773082.88015: stdout chunk (state=3): >>><<< 11178 1726773082.88040: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11178 1726773082.88054: _low_level_execute_command(): starting 11178 1726773082.88059: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200 `" && echo ansible-tmp-1726773082.8804872-11178-243347237094200="` echo /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200 `" ) && sleep 0' 11178 1726773082.90832: stdout chunk (state=2): >>>ansible-tmp-1726773082.8804872-11178-243347237094200=/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200 <<< 11178 1726773082.90953: stderr chunk (state=3): >>><<< 11178 1726773082.90958: stdout chunk (state=3): >>><<< 11178 1726773082.90975: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773082.8804872-11178-243347237094200=/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200 , stderr= 11178 1726773082.90999: evaluation_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 11178 1726773082.91017: search_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 11178 1726773082.92470: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11178 1726773082.92475: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11178 1726773082.92478: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11178 1726773082.92480: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11178 1726773082.92482: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11178 1726773082.92487: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.92489: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11178 1726773082.92491: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11178 1726773082.92494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11178 1726773082.92512: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11178 1726773082.92515: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 11178 1726773082.92517: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.92767: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11178 1726773082.92773: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11178 1726773082.92776: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11178 1726773082.92779: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11178 1726773082.92781: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11178 1726773082.92784: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.92786: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11178 1726773082.92788: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11178 1726773082.92790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11178 1726773082.92806: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11178 1726773082.92808: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11178 1726773082.92811: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.92839: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11178 1726773082.92842: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11178 1726773082.92844: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11178 1726773082.92846: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11178 1726773082.92848: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11178 1726773082.92849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.92851: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11178 1726773082.92853: Loading FilterModule 'urls' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11178 1726773082.92855: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11178 1726773082.92868: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11178 1726773082.92872: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11178 1726773082.92875: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.93026: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11178 1726773082.93031: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11178 1726773082.93033: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11178 1726773082.93035: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11178 1726773082.93037: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11178 1726773082.93039: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.93041: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11178 1726773082.93042: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11178 1726773082.93044: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11178 1726773082.93057: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11178 1726773082.93059: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11178 1726773082.93061: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.93296: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11178 1726773082.93301: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11178 1726773082.93303: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11178 1726773082.93305: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, 
class_only=False) 11178 1726773082.93307: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11178 1726773082.93309: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.93311: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11178 1726773082.93313: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11178 1726773082.93315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11178 1726773082.93331: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11178 1726773082.93335: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11178 1726773082.93338: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.93369: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11178 1726773082.93373: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11178 1726773082.93375: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11178 1726773082.93377: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11178 1726773082.93379: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11178 1726773082.93380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.93382: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11178 1726773082.93386: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11178 1726773082.93388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11178 1726773082.93402: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11178 1726773082.93404: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11178 1726773082.93406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11178 1726773082.94555: Loading ActionModule 'copy' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11178 1726773082.94634: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11178 1726773082.94676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_stat.py 11178 1726773082.95021: Sending initial data 11178 1726773082.95037: Sent initial data (152 bytes) 11178 1726773082.97533: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp4_pn8ve2 /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_stat.py <<< 11178 1726773082.98543: stderr chunk (state=3): >>><<< 11178 1726773082.98555: stdout chunk (state=3): >>><<< 11178 1726773082.98575: done transferring module to remote 11178 1726773082.98592: _low_level_execute_command(): starting 11178 1726773082.98598: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/ /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_stat.py && sleep 0' 11178 1726773083.01136: stderr chunk (state=2): >>><<< 11178 1726773083.01147: stdout chunk (state=2): >>><<< 11178 1726773083.01169: _low_level_execute_command() done: rc=0, stdout=, stderr= 11178 1726773083.01173: _low_level_execute_command(): starting 11178 1726773083.01179: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_stat.py && sleep 0' 11178 1726773083.16593: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 351, "inode": 90177794, "dev": 51713, "nlink": 1, "atime": 1726773073.3037353, "mtime": 1726773072.4338222, "ctime": 1726773072.7046545, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "mimetype": "text/plain", "charset": "us-ascii", "version": "4206896328", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11178 1726773083.17635: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11178 1726773083.17671: stderr chunk (state=3): >>><<< 11178 1726773083.17677: stdout chunk (state=3): >>><<< 11178 1726773083.17699: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 351, "inode": 90177794, "dev": 51713, "nlink": 1, "atime": 1726773073.3037353, "mtime": 1726773072.4338222, "ctime": 1726773072.7046545, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "mimetype": "text/plain", "charset": "us-ascii", "version": "4206896328", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 11178 1726773083.17759: done with _execute_module (stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11178 1726773083.18079: Sending initial data 11178 1726773083.18096: Sent initial data (160 bytes) 11178 1726773083.20818: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmptqvfq9rd/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source <<< 11178 1726773083.21133: stderr chunk (state=3): >>><<< 11178 1726773083.21138: stdout chunk (state=3): >>><<< 11178 1726773083.21162: _low_level_execute_command(): starting 11178 1726773083.21168: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/ /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source && sleep 0' 11178 1726773083.23902: stderr chunk (state=2): >>><<< 11178 1726773083.23913: stdout chunk (state=2): >>><<< 11178 1726773083.23931: _low_level_execute_command() done: rc=0, stdout=, stderr= 11178 1726773083.24039: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 11178 1726773083.24089: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_copy.py 11178 1726773083.24375: Sending initial data 11178 1726773083.24393: Sent initial data (152 bytes) 11178 1726773083.26874: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmptgxi71np 
/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_copy.py <<< 11178 1726773083.27900: stderr chunk (state=3): >>><<< 11178 1726773083.27906: stdout chunk (state=3): >>><<< 11178 1726773083.27931: done transferring module to remote 11178 1726773083.27947: _low_level_execute_command(): starting 11178 1726773083.27952: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/ /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_copy.py && sleep 0' 11178 1726773083.30539: stderr chunk (state=2): >>><<< 11178 1726773083.30550: stdout chunk (state=2): >>><<< 11178 1726773083.30569: _low_level_execute_command() done: rc=0, stdout=, stderr= 11178 1726773083.30574: _low_level_execute_command(): starting 11178 1726773083.30581: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/AnsiballZ_copy.py && sleep 0' 11178 1726773083.46454: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source", "md5sum": "1fd7f2202613b516022cf613601e26bd", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 11178 1726773083.48396: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11178 1726773083.48449: stderr chunk (state=3): >>><<< 11178 1726773083.48457: stdout chunk (state=3): >>><<< 11178 1726773083.48486: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source", "md5sum": "1fd7f2202613b516022cf613601e26bd", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
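The stat/copy exchange above is the role's "Apply kernel settings" step: the controller renders kernel_settings.j2 locally, transfers it to the remote tmp dir as "source", and the copy module installs it as /etc/tuned/kernel_settings/tuned.conf with mode 0644, reporting changed because the new checksum (3107bf...) differs from the one stat returned for the existing file (221aa3...). A minimal sketch of an equivalent task, reconstructed only from the module_args logged here; the role's actual task source is not shown in this log:

# Hedged sketch -- reconstructed from the logged copy module_args,
# not taken from the role source.
- name: Apply kernel settings
  template:
    src: kernel_settings.j2
    dest: /etc/tuned/kernel_settings/tuned.conf
    mode: "0644"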
11178 1726773083.48533: done with _execute_module (copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '3107bf46f5c007ef178305bb243dd11664f9bf35', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11178 1726773083.48576: _low_level_execute_command(): starting 11178 1726773083.48588: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/ > /dev/null 2>&1 && sleep 0' 11178 1726773083.51600: stderr chunk (state=2): >>><<< 11178 1726773083.51619: stdout chunk (state=2): >>><<< 11178 1726773083.51647: _low_level_execute_command() done: rc=0, stdout=, stderr= 11178 1726773083.51681: handler run complete 11178 1726773083.51720: attempt loop complete, returning result 11178 1726773083.51725: _execute() done 11178 1726773083.51727: dumping result to json 11178 1726773083.51731: done dumping result, returning 11178 1726773083.51746: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [12a3200b-1e9d-1dbd-cc52-0000000005f4] 11178 1726773083.51762: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f4 11178 1726773083.51801: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f4 11178 1726773083.51870: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "1fd7f2202613b516022cf613601e26bd", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "src": "/root/.ansible/tmp/ansible-tmp-1726773082.8804872-11178-243347237094200/source", "state": "file", "uid": 0 } 8119 1726773083.52112: no more pending results, returning what we have 8119 1726773083.52119: results queue empty 8119 1726773083.52121: checking for any_errors_fatal 8119 1726773083.52127: done checking for any_errors_fatal 8119 1726773083.52129: checking for max_fail_percentage 8119 1726773083.52132: done checking for max_fail_percentage 8119 1726773083.52133: checking to see if all hosts have failed and the running result is not ok 8119 1726773083.52135: done checking to see if all hosts have failed 8119 1726773083.52137: getting the remaining hosts for this loop 8119 1726773083.52140: done getting the remaining hosts for this loop 8119 1726773083.52147: building list of next tasks for hosts 8119 1726773083.52150: getting the next task for host managed_node2 8119 1726773083.52157: done getting next task for host managed_node2 8119 1726773083.52162: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773083.52165: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, 
run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773083.52166: done building task lists 8119 1726773083.52168: counting tasks in each state of execution 8119 1726773083.52170: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773083.52172: advancing hosts in ITERATING_TASKS 8119 1726773083.52173: starting to advance hosts 8119 1726773083.52175: getting the next task for host managed_node2 8119 1726773083.52178: done getting next task for host managed_node2 8119 1726773083.52180: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773083.52182: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773083.52188: done advancing hosts to next task 8119 1726773083.52201: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773083.52204: getting variables 8119 1726773083.52206: in VariableManager get_vars() 8119 1726773083.52235: Calling all_inventory to load vars for managed_node2 8119 1726773083.52238: Calling groups_inventory to load vars for managed_node2 8119 1726773083.52241: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773083.52263: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.52276: Calling all_plugins_play to load vars for managed_node2 8119 1726773083.52292: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.52303: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773083.52317: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.52324: Calling groups_plugins_play to load vars for managed_node2 8119 1726773083.52334: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.52351: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.52365: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.52575: done with get_vars() 8119 1726773083.52588: done getting variables 8119 1726773083.52594: sending task start callback, copying the task so we can template it temporarily 8119 1726773083.52595: done copying, going to template now 8119 1726773083.52597: done templating 8119 1726773083.52599: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:23 -0400 (0:00:00.707) 0:01:18.082 **** 8119 1726773083.52619: sending task start callback 8119 1726773083.52621: entering _queue_task() for managed_node2/service 8119 1726773083.52746: worker is 1 (out of 1 available) 8119 1726773083.52789: exiting _queue_task() for managed_node2/service 8119 1726773083.52864: done queuing things up, now waiting for results queue to drain 8119 1726773083.52869: waiting for pending results... 11211 1726773083.52924: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 11211 1726773083.52974: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f5 11211 1726773083.55128: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11211 1726773083.55233: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11211 1726773083.55295: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11211 1726773083.55333: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11211 1726773083.55372: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11211 1726773083.55406: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11211 1726773083.55461: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11211 1726773083.55491: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11211 1726773083.55511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11211 1726773083.55592: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11211 1726773083.55614: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11211 1726773083.55629: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11211 1726773083.55788: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11211 1726773083.55792: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11211 1726773083.55795: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11211 1726773083.55797: Loading FilterModule 
'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11211 1726773083.55798: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11211 1726773083.55800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11211 1726773083.55802: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11211 1726773083.55804: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11211 1726773083.55806: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11211 1726773083.55821: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11211 1726773083.55824: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11211 1726773083.55827: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11211 1726773083.56018: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11211 1726773083.56026: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11211 1726773083.56030: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11211 1726773083.56034: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11211 1726773083.56037: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11211 1726773083.56040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11211 1726773083.56044: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11211 1726773083.56047: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11211 1726773083.56050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11211 1726773083.56079: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11211 1726773083.56086: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11211 1726773083.56090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11211 1726773083.56385: when evaluation is False, skipping this task 11211 1726773083.56435: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11211 1726773083.56441: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11211 1726773083.56445: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11211 1726773083.56448: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11211 1726773083.56451: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11211 1726773083.56454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11211 1726773083.56457: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11211 1726773083.56460: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11211 1726773083.56463: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11211 1726773083.56494: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11211 1726773083.56500: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11211 1726773083.56503: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11211 1726773083.56602: dumping result to json 11211 1726773083.56611: done dumping result, returning 11211 1726773083.56616: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [12a3200b-1e9d-1dbd-cc52-0000000005f5] 11211 1726773083.56623: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f5 11211 1726773083.56625: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f5 11211 1726773083.56627: WORKER PROCESS EXITING skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "item": "tuned", "skip_reason": "Conditional result was False" } 8119 1726773083.56865: no more pending results, returning what we have 8119 1726773083.56872: results queue empty 8119 1726773083.56874: checking for any_errors_fatal 8119 1726773083.56884: done checking for any_errors_fatal 8119 1726773083.56887: checking for max_fail_percentage 8119 1726773083.56889: done checking for max_fail_percentage 8119 1726773083.56890: checking to see if all hosts have failed and the running result is not ok 8119 1726773083.56892: done checking to see if all hosts have failed 8119 1726773083.56893: getting the remaining hosts for this loop 8119 1726773083.56895: done getting the remaining hosts for this loop 8119 1726773083.56900: building list of next tasks for 
hosts 8119 1726773083.56902: getting the next task for host managed_node2 8119 1726773083.56907: done getting next task for host managed_node2 8119 1726773083.56913: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773083.56916: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773083.56918: done building task lists 8119 1726773083.56919: counting tasks in each state of execution 8119 1726773083.56922: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773083.56923: advancing hosts in ITERATING_TASKS 8119 1726773083.56924: starting to advance hosts 8119 1726773083.56926: getting the next task for host managed_node2 8119 1726773083.56928: done getting next task for host managed_node2 8119 1726773083.56930: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773083.56932: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773083.56934: done advancing hosts to next task 8119 1726773083.56945: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773083.56948: getting variables 8119 1726773083.56951: in VariableManager get_vars() 8119 1726773083.56976: Calling all_inventory to load vars for managed_node2 8119 1726773083.56980: Calling groups_inventory to load vars for managed_node2 8119 1726773083.56982: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773083.57008: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.57022: Calling all_plugins_play to load vars for managed_node2 8119 1726773083.57033: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.57043: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773083.57053: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.57060: Calling groups_plugins_play to load vars for managed_node2 8119 1726773083.57070: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.57091: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.57113: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773083.57330: done with get_vars() 8119 1726773083.57342: done getting variables 8119 1726773083.57347: sending task start callback, copying the task so we can template it temporarily 8119 1726773083.57349: done copying, going to template now 8119 1726773083.57351: done templating 8119 1726773083.57352: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:23 -0400 (0:00:00.047) 0:01:18.130 **** 8119 1726773083.57368: sending task start callback 8119 1726773083.57369: entering _queue_task() for managed_node2/command 8119 1726773083.57479: worker is 1 (out of 1 available) 8119 1726773083.57509: exiting _queue_task() for managed_node2/command 8119 1726773083.57568: done queuing things up, now waiting for results queue to drain 8119 1726773083.57571: waiting for pending results... 
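The "Restart tuned to apply active profile, mode changes" task above (main.yml:149) was skipped: its when condition evaluated to false for item=tuned, so the service action never ran on managed_node2. A sketch of a looped, conditional restart consistent with that skip record; the real conditional expression is not visible in this log and the variable name below is hypothetical:

# Hedged sketch only; the actual when: expression is not shown in the log,
# and kernel_settings_restart_needed is a hypothetical variable name.
- name: Restart tuned to apply active profile, mode changes
  service:
    name: "{{ item }}"
    state: restarted
  loop:
    - tuned
  when: kernel_settings_restart_needed | bool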
11214 1726773083.57751: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 11214 1726773083.57809: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f6 11214 1726773083.57851: calling self._execute() 11214 1726773083.59913: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11214 1726773083.59996: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11214 1726773083.60051: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11214 1726773083.60077: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11214 1726773083.60111: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11214 1726773083.60145: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11214 1726773083.60188: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11214 1726773083.60226: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11214 1726773083.60245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11214 1726773083.60327: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11214 1726773083.60344: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11214 1726773083.60361: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11214 1726773083.61046: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11214 1726773083.61084: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11214 1726773083.61096: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11214 1726773083.61108: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11214 1726773083.61116: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11214 1726773083.61217: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11214 1726773083.61235: starting attempt loop 11214 1726773083.61237: running the handler 11214 1726773083.61245: _low_level_execute_command(): starting 11214 1726773083.61248: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11214 1726773083.63804: stdout chunk (state=2): >>>/root <<< 11214 1726773083.63925: stderr chunk (state=3): >>><<< 11214 1726773083.63931: stdout chunk (state=3): >>><<< 11214 1726773083.63954: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11214 1726773083.63968: _low_level_execute_command(): starting 11214 1726773083.63974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145 `" && echo ansible-tmp-1726773083.639624-11214-86450587369145="` echo /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145 `" ) && sleep 0' 11214 1726773083.66865: stdout chunk (state=2): >>>ansible-tmp-1726773083.639624-11214-86450587369145=/root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145 <<< 11214 1726773083.66998: stderr chunk (state=3): >>><<< 11214 1726773083.67004: stdout chunk (state=3): >>><<< 11214 1726773083.67025: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773083.639624-11214-86450587369145=/root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145 , stderr= 11214 1726773083.67142: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11214 1726773083.67207: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145/AnsiballZ_command.py 11214 1726773083.68052: Sending initial data 11214 1726773083.68066: Sent initial data (153 bytes) 11214 1726773083.70575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpfeajjnac /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145/AnsiballZ_command.py <<< 11214 1726773083.71616: stderr chunk (state=3): >>><<< 11214 1726773083.71624: stdout chunk (state=3): >>><<< 11214 1726773083.71647: done transferring module to remote 11214 1726773083.71663: _low_level_execute_command(): starting 11214 1726773083.71668: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145/ /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145/AnsiballZ_command.py && sleep 0' 11214 1726773083.74285: stderr chunk (state=2): >>><<< 11214 1726773083.74305: stdout chunk (state=2): >>><<< 11214 1726773083.74330: _low_level_execute_command() done: rc=0, stdout=, stderr= 11214 1726773083.74335: _low_level_execute_command(): starting 11214 1726773083.74341: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145/AnsiballZ_command.py && sleep 0' 11214 1726773085.03745: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:23.888962", "end": "2024-09-19 15:11:25.035385", "delta": "0:00:01.146423", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11214 1726773085.04941: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11214 1726773085.05013: stderr chunk (state=3): >>><<< 11214 1726773085.05021: stdout chunk (state=3): >>><<< 11214 1726773085.05049: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:23.888962", "end": "2024-09-19 15:11:25.035385", "delta": "0:00:01.146423", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11214 1726773085.05096: done with _execute_module (command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11214 1726773085.05115: _low_level_execute_command(): starting 11214 1726773085.05123: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773083.639624-11214-86450587369145/ > /dev/null 2>&1 && sleep 0' 11214 1726773085.08121: stderr chunk (state=2): >>><<< 11214 1726773085.08140: stdout chunk (state=2): >>><<< 11214 1726773085.08170: _low_level_execute_command() done: rc=0, stdout=, stderr= 11214 1726773085.08180: handler run complete 11214 1726773085.08193: attempt loop complete, returning result 11214 1726773085.08213: _execute() done 11214 1726773085.08216: dumping result to json 11214 1726773085.08222: done dumping result, returning 11214 1726773085.08239: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [12a3200b-1e9d-1dbd-cc52-0000000005f6] 11214 1726773085.08257: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f6 11214 1726773085.08305: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f6 11214 1726773085.08313: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.146423", "end": "2024-09-19 15:11:25.035385", "rc": 0, "start": "2024-09-19 15:11:23.888962" } 8119 1726773085.08832: no more pending results, returning what we have 8119 1726773085.08837: results queue empty 8119 1726773085.08839: checking for any_errors_fatal 8119 1726773085.08844: done checking for any_errors_fatal 8119 1726773085.08846: checking for max_fail_percentage 8119 1726773085.08849: done checking for max_fail_percentage 8119 1726773085.08851: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.08853: done checking to see if all hosts have failed 8119 1726773085.08855: getting the remaining hosts for this loop 8119 1726773085.08857: done getting the remaining hosts for this loop 8119 1726773085.08865: building list of next tasks for hosts 8119 1726773085.08868: 
getting the next task for host managed_node2 8119 1726773085.08875: done getting next task for host managed_node2 8119 1726773085.08880: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773085.08886: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.08889: done building task lists 8119 1726773085.08891: counting tasks in each state of execution 8119 1726773085.08895: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.08898: advancing hosts in ITERATING_TASKS 8119 1726773085.08900: starting to advance hosts 8119 1726773085.08902: getting the next task for host managed_node2 8119 1726773085.08907: done getting next task for host managed_node2 8119 1726773085.08913: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773085.08917: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.08919: done advancing hosts to next task 8119 1726773085.08937: getting variables 8119 1726773085.08941: in VariableManager get_vars() 8119 1726773085.08978: Calling all_inventory to load vars for managed_node2 8119 1726773085.08986: Calling groups_inventory to load vars for managed_node2 8119 1726773085.08990: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.09025: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.09042: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.09061: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.09075: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.09094: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.09105: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.09124: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.09152: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.09173: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.09529: done with get_vars() 8119 1726773085.09543: done getting variables 8119 1726773085.09550: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.09552: done copying, going to template now 8119 1726773085.09555: done templating 8119 1726773085.09557: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:25 -0400 (0:00:01.522) 0:01:19.652 **** 8119 1726773085.09580: sending task start callback 8119 1726773085.09585: entering _queue_task() for managed_node2/include_tasks 8119 1726773085.09769: worker is 1 (out of 1 available) 8119 1726773085.09854: exiting _queue_task() for managed_node2/include_tasks 8119 1726773085.09935: done queuing things up, now waiting for results queue to drain 8119 1726773085.09947: waiting for pending results... 
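The "Tuned apply settings" task (main.yml:157) that completed above ran the command module with tuned-adm, activating the combined profile string 'virtual-guest kernel_settings' and returning rc=0 after roughly 1.1 seconds. A sketch of an equivalent command task; the real role presumably builds the profile list from a variable rather than hard-coding it, which is an assumption here:

# Hedged sketch; only the command line itself is taken from the logged
# module_args. The role likely templates the profile string instead.
- name: Tuned apply settings
  command: "tuned-adm profile 'virtual-guest kernel_settings'"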
11282 1726773085.09988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 11282 1726773085.10045: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f7 11282 1726773085.10094: calling self._execute() 11282 1726773085.11889: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11282 1726773085.12005: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11282 1726773085.12073: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11282 1726773085.12121: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11282 1726773085.12161: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11282 1726773085.12203: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11282 1726773085.12265: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11282 1726773085.12302: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11282 1726773085.12328: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11282 1726773085.12452: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11282 1726773085.12479: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11282 1726773085.12506: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11282 1726773085.12872: _execute() done 11282 1726773085.12881: dumping result to json 11282 1726773085.12886: done dumping result, returning 11282 1726773085.12892: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [12a3200b-1e9d-1dbd-cc52-0000000005f7] 11282 1726773085.12906: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f7 11282 1726773085.12936: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f7 11282 1726773085.12940: WORKER PROCESS EXITING 8119 1726773085.13279: no more pending results, returning what we have 8119 1726773085.13289: in VariableManager get_vars() 8119 1726773085.13331: Calling all_inventory to load vars for managed_node2 8119 1726773085.13340: Calling groups_inventory to load vars for managed_node2 8119 1726773085.13344: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.13368: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.13379: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.13392: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.13410: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.13429: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.13440: Calling groups_plugins_play to load vars for managed_node2 
8119 1726773085.13455: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.13485: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.13512: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.13830: done with get_vars() 8119 1726773085.13893: we have included files to process 8119 1726773085.13897: generating all_blocks data 8119 1726773085.13901: done generating all_blocks data 8119 1726773085.13906: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773085.13912: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773085.13917: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8119 1726773085.14238: done processing included file 8119 1726773085.14241: iterating over new_blocks loaded from include file 8119 1726773085.14244: in VariableManager get_vars() 8119 1726773085.14274: done with get_vars() 8119 1726773085.14279: filtering new block on tags 8119 1726773085.14369: done filtering new block on tags 8119 1726773085.14387: done iterating over new_blocks loaded from include file 8119 1726773085.14391: extending task lists for all hosts with included blocks 8119 1726773085.14821: done extending task lists 8119 1726773085.14825: done processing included files 8119 1726773085.14826: results queue empty 8119 1726773085.14828: checking for any_errors_fatal 8119 1726773085.14832: done checking for any_errors_fatal 8119 1726773085.14833: checking for max_fail_percentage 8119 1726773085.14834: done checking for max_fail_percentage 8119 1726773085.14836: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.14837: done checking to see if all hosts have failed 8119 1726773085.14838: getting the remaining hosts for this loop 8119 1726773085.14840: done getting the remaining hosts for this loop 8119 1726773085.14844: building list of next tasks for hosts 8119 1726773085.14845: getting the next task for host managed_node2 8119 1726773085.14850: done getting next task for host managed_node2 8119 1726773085.14853: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773085.14856: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.14858: done building task lists 8119 1726773085.14860: counting tasks in each state of execution 8119 1726773085.14864: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.14866: advancing hosts in ITERATING_TASKS 8119 1726773085.14868: starting to advance hosts 8119 1726773085.14870: getting the next task for host managed_node2 8119 1726773085.14875: done getting next task for host managed_node2 8119 1726773085.14878: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773085.14882: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.14886: done advancing hosts to next task 8119 1726773085.14895: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.14898: getting variables 8119 1726773085.14900: in VariableManager get_vars() 8119 1726773085.14917: Calling all_inventory to load vars for managed_node2 8119 1726773085.14921: Calling groups_inventory to load vars for managed_node2 8119 1726773085.14923: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.14938: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.14946: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.14956: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.14964: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.14974: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.14981: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.14993: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.15020: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.15035: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 
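The "Verify settings" task (main.yml:166) queued earlier is an include_tasks action: the strategy loaded verify_settings.yml from the collection and extended managed_node2's task list with its blocks, which is why the next task below comes from that file. A minimal sketch of such an include, assuming nothing beyond what the log shows:

# Hedged sketch of the include recorded above.
- name: Verify settings
  include_tasks: verify_settings.yml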
8119 1726773085.15223: done with get_vars() 8119 1726773085.15236: done getting variables 8119 1726773085.15241: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.15243: done copying, going to template now 8119 1726773085.15245: done templating 8119 1726773085.15246: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.056) 0:01:19.709 **** 8119 1726773085.15261: sending task start callback 8119 1726773085.15263: entering _queue_task() for managed_node2/command 8119 1726773085.15412: worker is 1 (out of 1 available) 8119 1726773085.15451: exiting _queue_task() for managed_node2/command 8119 1726773085.15527: done queuing things up, now waiting for results queue to drain 8119 1726773085.15533: waiting for pending results... 11285 1726773085.15592: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 11285 1726773085.15650: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000007c4 11285 1726773085.15697: calling self._execute() 11285 1726773085.15873: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11285 1726773085.15921: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11285 1726773085.15936: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11285 1726773085.15947: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11285 1726773085.15955: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11285 1726773085.16079: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11285 1726773085.16102: starting attempt loop 11285 1726773085.16105: running the handler 11285 1726773085.16116: _low_level_execute_command(): starting 11285 1726773085.16121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11285 1726773085.18769: stdout chunk (state=2): >>>/root <<< 11285 1726773085.18921: stderr chunk (state=3): >>><<< 11285 1726773085.18930: stdout chunk (state=3): >>><<< 11285 1726773085.18956: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11285 1726773085.18974: _low_level_execute_command(): starting 11285 1726773085.18981: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674 `" && echo ansible-tmp-1726773085.189651-11285-127095185738674="` echo /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674 `" ) && sleep 0' 11285 1726773085.21824: stdout chunk (state=2): >>>ansible-tmp-1726773085.189651-11285-127095185738674=/root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674 <<< 11285 1726773085.22010: stderr chunk (state=3): >>><<< 11285 1726773085.22019: stdout chunk (state=3): >>><<< 
11285 1726773085.22040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773085.189651-11285-127095185738674=/root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674 , stderr= 11285 1726773085.22175: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11285 1726773085.22242: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674/AnsiballZ_command.py 11285 1726773085.22588: Sending initial data 11285 1726773085.22603: Sent initial data (154 bytes) 11285 1726773085.25129: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpev7j1f89 /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674/AnsiballZ_command.py <<< 11285 1726773085.26148: stderr chunk (state=3): >>><<< 11285 1726773085.26155: stdout chunk (state=3): >>><<< 11285 1726773085.26178: done transferring module to remote 11285 1726773085.26195: _low_level_execute_command(): starting 11285 1726773085.26202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674/ /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674/AnsiballZ_command.py && sleep 0' 11285 1726773085.28792: stderr chunk (state=2): >>><<< 11285 1726773085.28804: stdout chunk (state=2): >>><<< 11285 1726773085.28827: _low_level_execute_command() done: rc=0, stdout=, stderr= 11285 1726773085.28832: _low_level_execute_command(): starting 11285 1726773085.28839: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674/AnsiballZ_command.py && sleep 0' 11285 1726773085.54505: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:25.435060", "end": "2024-09-19 15:11:25.542961", "delta": "0:00:00.107901", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11285 1726773085.55743: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11285 1726773085.55795: stderr chunk (state=3): >>><<< 11285 1726773085.55803: stdout chunk (state=3): >>><<< 11285 1726773085.55826: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:25.435060", "end": "2024-09-19 15:11:25.542961", "delta": "0:00:00.107901", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
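The module invocation above shows what "Check that settings are applied correctly" actually runs on the managed host: the command module with argv ["tuned-adm", "verify", "-i"]. The raw module result reports changed: true, while the task result below reports changed: false, so the task evidently overrides its changed status. A minimal sketch of a task consistent with this log follows; the register name is a hypothetical placeholder and the sketch is not copied from roles/kernel_settings/tasks/verify_settings.yml:

    # Sketch reconstructed from the logged module args, not from the role source.
    - name: Check that settings are applied correctly
      command: tuned-adm verify -i
      register: __kernel_settings_register_verify   # hypothetical name
      changed_when: false   # matches the "changed": false shown in the ok: result below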
11285 1726773085.55858: done with _execute_module (command, {'_raw_params': 'tuned-adm verify -i', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11285 1726773085.55868: _low_level_execute_command(): starting 11285 1726773085.55873: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773085.189651-11285-127095185738674/ > /dev/null 2>&1 && sleep 0' 11285 1726773085.58588: stderr chunk (state=2): >>><<< 11285 1726773085.58602: stdout chunk (state=2): >>><<< 11285 1726773085.58624: _low_level_execute_command() done: rc=0, stdout=, stderr= 11285 1726773085.58631: handler run complete 11285 1726773085.58673: attempt loop complete, returning result 11285 1726773085.58688: _execute() done 11285 1726773085.58691: dumping result to json 11285 1726773085.58695: done dumping result, returning 11285 1726773085.58711: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [12a3200b-1e9d-1dbd-cc52-0000000007c4] 11285 1726773085.58726: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000007c4 11285 1726773085.58764: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000007c4 11285 1726773085.58769: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.107901", "end": "2024-09-19 15:11:25.542961", "rc": 0, "start": "2024-09-19 15:11:25.435060" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8119 1726773085.59051: no more pending results, returning what we have 8119 1726773085.59057: results queue empty 8119 1726773085.59059: checking for any_errors_fatal 8119 1726773085.59062: done checking for any_errors_fatal 8119 1726773085.59064: checking for max_fail_percentage 8119 1726773085.59066: done checking for max_fail_percentage 8119 1726773085.59067: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.59069: done checking to see if all hosts have failed 8119 1726773085.59070: getting the remaining hosts for this loop 8119 1726773085.59072: done getting the remaining hosts for this loop 8119 1726773085.59077: building list of next tasks for hosts 8119 1726773085.59079: getting the next task for host managed_node2 8119 1726773085.59090: done getting next task for host managed_node2 8119 1726773085.59095: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773085.59099: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.59101: done building task lists 8119 1726773085.59102: counting tasks in each state of execution 8119 1726773085.59105: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.59107: advancing hosts in ITERATING_TASKS 8119 1726773085.59110: starting to advance hosts 8119 1726773085.59112: getting the next task for host managed_node2 8119 1726773085.59116: done getting next task for host managed_node2 8119 1726773085.59118: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773085.59120: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.59122: done advancing hosts to next task 8119 1726773085.59134: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.59137: getting variables 8119 1726773085.59139: in VariableManager get_vars() 8119 1726773085.59165: Calling all_inventory to load vars for managed_node2 8119 1726773085.59168: Calling groups_inventory to load vars for managed_node2 8119 1726773085.59171: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.59196: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.59214: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.59227: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.59236: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.59247: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.59253: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.59263: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.59281: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.59300: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.59513: done with get_vars() 8119 1726773085.59525: done getting variables 8119 1726773085.59531: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.59532: done copying, going to template now 8119 1726773085.59534: done templating 8119 1726773085.59535: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.442) 0:01:20.151 **** 8119 1726773085.59555: sending task start callback 8119 1726773085.59557: entering _queue_task() for managed_node2/shell 8119 1726773085.59695: worker is 1 (out of 1 available) 8119 1726773085.59737: exiting _queue_task() for managed_node2/shell 8119 1726773085.59813: done queuing things up, now waiting for results queue to drain 8119 1726773085.59819: waiting for pending results... 
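The task just queued, "Get last verify results from log" (a shell action at verify_settings.yml:12), is skipped in the next entries because its when clause evaluates to False. The log never shows the task body, so the following is only a plausible shape for it, assuming it pulls the most recent verification output from the TuneD log and is guarded on the verify command having failed; the pipeline, register name, and condition are all assumptions, and the condition is at least consistent with the observed skip since the verify rc above was 0:

    # Illustrative sketch only; the real task body is not visible in this log.
    - name: Get last verify results from log
      shell: grep -i verify /var/log/tuned/tuned.log | tail -n 20   # hypothetical pipeline
      register: __kernel_settings_register_verify_log               # hypothetical name
      changed_when: false
      when: __kernel_settings_register_verify.rc != 0               # assumption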
11295 1726773085.59877: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 11295 1726773085.59943: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000007c5 11295 1726773085.59989: calling self._execute() 11295 1726773085.61774: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11295 1726773085.61862: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11295 1726773085.61929: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11295 1726773085.61960: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11295 1726773085.61989: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11295 1726773085.62018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11295 1726773085.62066: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11295 1726773085.62090: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11295 1726773085.62107: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11295 1726773085.62193: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11295 1726773085.62211: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11295 1726773085.62225: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11295 1726773085.62498: when evaluation is False, skipping this task 11295 1726773085.62503: _execute() done 11295 1726773085.62505: dumping result to json 11295 1726773085.62507: done dumping result, returning 11295 1726773085.62511: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [12a3200b-1e9d-1dbd-cc52-0000000007c5] 11295 1726773085.62520: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000007c5 11295 1726773085.62548: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000007c5 11295 1726773085.62552: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773085.62695: no more pending results, returning what we have 8119 1726773085.62702: results queue empty 8119 1726773085.62704: checking for any_errors_fatal 8119 1726773085.62714: done checking for any_errors_fatal 8119 1726773085.62716: checking for max_fail_percentage 8119 1726773085.62720: done checking for max_fail_percentage 8119 1726773085.62722: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.62724: done checking to see if all hosts have failed 8119 1726773085.62725: getting the remaining hosts for this loop 8119 1726773085.62728: done getting the remaining hosts for this loop 8119 1726773085.62736: building list of next tasks for hosts 8119 1726773085.62738: getting the next task for host managed_node2 8119 1726773085.62747: done getting next task for host managed_node2 8119 1726773085.62752: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader 
errors 8119 1726773085.62756: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.62759: done building task lists 8119 1726773085.62761: counting tasks in each state of execution 8119 1726773085.62764: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.62767: advancing hosts in ITERATING_TASKS 8119 1726773085.62769: starting to advance hosts 8119 1726773085.62771: getting the next task for host managed_node2 8119 1726773085.62775: done getting next task for host managed_node2 8119 1726773085.62778: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8119 1726773085.62781: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.62785: done advancing hosts to next task 8119 1726773085.62800: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.62804: getting variables 8119 1726773085.62807: in VariableManager get_vars() 8119 1726773085.62845: Calling all_inventory to load vars for managed_node2 8119 1726773085.62850: Calling groups_inventory to load vars for managed_node2 8119 1726773085.62854: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.62881: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.62900: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.62920: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.62933: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.62949: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.62957: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.62967: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.62988: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.63003: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.63241: done with get_vars() 8119 1726773085.63252: done getting variables 8119 1726773085.63257: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.63258: done copying, going to template now 8119 1726773085.63260: done templating 8119 1726773085.63261: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.037) 0:01:20.189 **** 8119 1726773085.63278: sending task start callback 8119 1726773085.63279: entering _queue_task() for managed_node2/fail 8119 1726773085.63418: worker is 1 (out of 1 available) 8119 1726773085.63456: exiting _queue_task() for managed_node2/fail 8119 1726773085.63534: done queuing things up, now waiting for results queue to drain 8119 1726773085.63540: waiting for pending results... 
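The next task, "Report errors that are not bootloader errors" (a fail action at verify_settings.yml:23), is likewise skipped because its conditional is False. A hedged sketch of how such a guarded fail task is typically written; the message and both conditions are guesses built on the hypothetical variables introduced above, not the role's actual source:

    # Illustrative sketch only; message and conditions are assumptions.
    - name: Report errors that are not bootloader errors
      fail:
        msg: tuned-adm verify failed for settings other than the bootloader cmdline
      when:
        - __kernel_settings_register_verify.rc != 0                               # assumption
        - "'cmdline' not in __kernel_settings_register_verify_log.stdout | d('')" # assumption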
11297 1726773085.63601: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 11297 1726773085.63661: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000007c6 11297 1726773085.63711: calling self._execute() 11297 1726773085.65435: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11297 1726773085.65543: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11297 1726773085.65595: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11297 1726773085.65628: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11297 1726773085.65657: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11297 1726773085.65687: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11297 1726773085.65728: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11297 1726773085.65758: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11297 1726773085.65775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11297 1726773085.65856: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11297 1726773085.65876: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11297 1726773085.65893: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11297 1726773085.66177: when evaluation is False, skipping this task 11297 1726773085.66182: _execute() done 11297 1726773085.66186: dumping result to json 11297 1726773085.66188: done dumping result, returning 11297 1726773085.66193: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [12a3200b-1e9d-1dbd-cc52-0000000007c6] 11297 1726773085.66201: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000007c6 11297 1726773085.66228: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000007c6 11297 1726773085.66232: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773085.66432: no more pending results, returning what we have 8119 1726773085.66437: results queue empty 8119 1726773085.66439: checking for any_errors_fatal 8119 1726773085.66443: done checking for any_errors_fatal 8119 1726773085.66445: checking for max_fail_percentage 8119 1726773085.66448: done checking for max_fail_percentage 8119 1726773085.66450: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.66452: done checking to see if all hosts have failed 8119 1726773085.66454: getting the remaining hosts for this loop 8119 1726773085.66456: done getting the remaining hosts for this loop 8119 1726773085.66464: building list of next tasks for hosts 8119 1726773085.66466: getting the next task for host managed_node2 8119 1726773085.66476: done getting next task for host managed_node2 8119 1726773085.66481: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag 
that reboot is needed to apply changes 8119 1726773085.66488: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.66493: done building task lists 8119 1726773085.66495: counting tasks in each state of execution 8119 1726773085.66499: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.66501: advancing hosts in ITERATING_TASKS 8119 1726773085.66504: starting to advance hosts 8119 1726773085.66506: getting the next task for host managed_node2 8119 1726773085.66514: done getting next task for host managed_node2 8119 1726773085.66517: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8119 1726773085.66520: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.66522: done advancing hosts to next task 8119 1726773085.66535: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.66538: getting variables 8119 1726773085.66541: in VariableManager get_vars() 8119 1726773085.66568: Calling all_inventory to load vars for managed_node2 8119 1726773085.66571: Calling groups_inventory to load vars for managed_node2 8119 1726773085.66573: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.66600: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.66615: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.66626: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.66635: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.66645: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.66651: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.66661: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.66678: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.66694: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.66912: done with get_vars() 8119 1726773085.66923: done getting variables 8119 1726773085.66929: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.66932: done copying, going to template now 8119 1726773085.66935: done templating 8119 1726773085.66936: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.036) 0:01:20.225 **** 8119 1726773085.66953: sending task start callback 8119 1726773085.66955: entering _queue_task() for managed_node2/set_fact 8119 1726773085.67085: worker is 1 (out of 1 available) 8119 1726773085.67126: exiting _queue_task() for managed_node2/set_fact 8119 1726773085.67196: done queuing things up, now waiting for results queue to drain 8119 1726773085.67201: waiting for pending results... 11299 1726773085.67269: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 11299 1726773085.67322: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f8 11299 1726773085.67367: calling self._execute() 11299 1726773085.67525: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11299 1726773085.67568: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11299 1726773085.67580: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11299 1726773085.67592: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11299 1726773085.67599: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11299 1726773085.67724: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11299 1726773085.67748: starting attempt loop 11299 1726773085.67752: running the handler 11299 1726773085.67772: handler run complete 11299 1726773085.67776: attempt loop complete, returning result 11299 1726773085.67778: _execute() done 11299 1726773085.67779: dumping result to json 11299 1726773085.67781: done dumping result, returning 11299 1726773085.67790: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000005f8] 11299 1726773085.67798: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f8 11299 1726773085.67828: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f8 11299 1726773085.67832: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8119 1726773085.67995: no more 
pending results, returning what we have 8119 1726773085.67999: results queue empty 8119 1726773085.68002: checking for any_errors_fatal 8119 1726773085.68007: done checking for any_errors_fatal 8119 1726773085.68012: checking for max_fail_percentage 8119 1726773085.68015: done checking for max_fail_percentage 8119 1726773085.68017: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.68019: done checking to see if all hosts have failed 8119 1726773085.68021: getting the remaining hosts for this loop 8119 1726773085.68023: done getting the remaining hosts for this loop 8119 1726773085.68030: building list of next tasks for hosts 8119 1726773085.68033: getting the next task for host managed_node2 8119 1726773085.68040: done getting next task for host managed_node2 8119 1726773085.68043: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773085.68047: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.68050: done building task lists 8119 1726773085.68052: counting tasks in each state of execution 8119 1726773085.68056: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.68058: advancing hosts in ITERATING_TASKS 8119 1726773085.68060: starting to advance hosts 8119 1726773085.68062: getting the next task for host managed_node2 8119 1726773085.68066: done getting next task for host managed_node2 8119 1726773085.68069: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773085.68072: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.68074: done advancing hosts to next task 8119 1726773085.68088: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.68092: getting variables 8119 1726773085.68095: in VariableManager get_vars() 8119 1726773085.68125: Calling all_inventory to load vars for managed_node2 8119 1726773085.68128: Calling groups_inventory to load vars for managed_node2 8119 1726773085.68131: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.68150: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.68160: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.68170: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.68178: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.68195: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.68203: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.68215: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.68234: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.68247: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.68478: done with get_vars() 8119 1726773085.68491: done getting variables 8119 1726773085.68496: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.68498: done copying, going to template now 8119 1726773085.68499: done templating 8119 1726773085.68501: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.015) 0:01:20.241 **** 8119 1726773085.68518: sending task start callback 8119 1726773085.68520: entering _queue_task() for managed_node2/set_fact 8119 1726773085.68635: worker is 1 (out of 1 available) 8119 1726773085.68671: exiting _queue_task() for managed_node2/set_fact 8119 1726773085.68745: done queuing things up, now waiting for results queue to drain 8119 1726773085.68750: waiting for pending results... 
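The ok: result above records the role's reboot flag as a plain fact, kernel_settings_reboot_required: false. A set_fact task producing that output would look roughly as follows; whether the value is a literal or derived from earlier results is not visible in the log, so the expression and the __kernel_settings_reboot_needed variable are assumptions:

    # Sketch: the fact name and this run's value (false) come from the log above;
    # the source expression is an assumption.
    - name: Set the flag that reboot is needed to apply changes
      set_fact:
        kernel_settings_reboot_required: "{{ __kernel_settings_reboot_needed | d(false) }}"

In this run the resulting fact is false, which is what the later test task "Ensure kernel_settings_reboot_required is not set or is false" then checks.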
11301 1726773085.68813: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 11301 1726773085.68861: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000005f9 11301 1726773085.68908: calling self._execute() 11301 1726773085.70950: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11301 1726773085.71166: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11301 1726773085.71223: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11301 1726773085.71252: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11301 1726773085.71282: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11301 1726773085.71322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11301 1726773085.71365: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11301 1726773085.71395: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11301 1726773085.71418: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11301 1726773085.71495: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11301 1726773085.71520: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11301 1726773085.71539: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11301 1726773085.71773: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11301 1726773085.71777: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11301 1726773085.71779: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11301 1726773085.71781: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11301 1726773085.71789: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11301 1726773085.71791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11301 1726773085.71793: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11301 1726773085.71796: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11301 1726773085.71798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11301 1726773085.71816: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11301 1726773085.71819: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11301 1726773085.71821: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11301 1726773085.71866: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11301 1726773085.71902: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11301 1726773085.71917: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11301 1726773085.71927: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11301 1726773085.71932: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11301 1726773085.72034: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11301 1726773085.72042: starting attempt loop 11301 1726773085.72044: running the handler 11301 1726773085.72054: handler run complete 11301 1726773085.72057: attempt loop complete, returning result 11301 1726773085.72059: _execute() done 11301 1726773085.72060: dumping result to json 11301 1726773085.72062: done dumping result, returning 11301 1726773085.72066: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [12a3200b-1e9d-1dbd-cc52-0000000005f9] 11301 1726773085.72073: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f9 11301 1726773085.72120: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000005f9 11301 1726773085.72163: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8119 1726773085.72308: no more pending results, returning what we have 8119 1726773085.72313: results queue empty 8119 1726773085.72316: checking for any_errors_fatal 8119 1726773085.72320: done checking for any_errors_fatal 8119 1726773085.72321: checking for max_fail_percentage 8119 1726773085.72324: done checking for max_fail_percentage 8119 1726773085.72326: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.72328: done checking to see if all hosts have failed 8119 1726773085.72330: getting the remaining hosts for this loop 8119 1726773085.72333: done getting the remaining hosts for this loop 8119 1726773085.72341: building list of next tasks for hosts 8119 1726773085.72343: getting the next task for host managed_node2 8119 1726773085.72352: done getting next task for host managed_node2 8119 1726773085.72355: ^ task is: TASK: Force handlers 8119 1726773085.72358: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.72360: done building task lists 8119 1726773085.72362: counting tasks in each state of execution 8119 1726773085.72366: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.72368: advancing hosts in ITERATING_TASKS 8119 1726773085.72370: starting to advance hosts 8119 1726773085.72372: getting the next task for host managed_node2 8119 1726773085.72377: done getting next task for host managed_node2 8119 1726773085.72379: ^ task is: TASK: Force handlers 8119 1726773085.72381: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.72386: done advancing hosts to next task META: ran handlers 8119 1726773085.72412: done queuing things up, now waiting for results queue to drain 8119 1726773085.72415: results queue empty 8119 1726773085.72417: checking for any_errors_fatal 8119 1726773085.72420: done checking for any_errors_fatal 8119 1726773085.72422: checking for max_fail_percentage 8119 1726773085.72424: done checking for max_fail_percentage 8119 1726773085.72426: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.72427: done checking to see if all hosts have failed 8119 1726773085.72429: getting the remaining hosts for this loop 8119 1726773085.72431: done getting the remaining hosts for this loop 8119 1726773085.72437: building list of next tasks for hosts 8119 1726773085.72440: getting the next task for host managed_node2 8119 1726773085.72443: done getting next task for host managed_node2 8119 1726773085.72445: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8119 1726773085.72447: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.72449: done building task lists 8119 1726773085.72450: counting tasks in each state of execution 8119 1726773085.72452: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.72453: advancing hosts in ITERATING_TASKS 8119 1726773085.72454: starting to advance hosts 8119 1726773085.72456: getting the next task for host managed_node2 8119 1726773085.72457: done getting next task for host managed_node2 8119 1726773085.72459: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8119 1726773085.72460: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.72462: done advancing hosts to next task 8119 1726773085.72469: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.72472: getting variables 8119 1726773085.72474: in VariableManager get_vars() 8119 1726773085.72502: Calling all_inventory to load vars for managed_node2 8119 1726773085.72506: Calling groups_inventory to load vars for managed_node2 8119 1726773085.72509: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.72529: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.72540: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.72551: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.72562: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.72574: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.72581: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.72592: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.72613: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.72627: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.72832: done with get_vars() 8119 1726773085.72842: done getting variables 8119 1726773085.72846: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.72848: done copying, going to template now 8119 1726773085.72849: done templating 8119 1726773085.72851: here goes the callback... TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:162 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.043) 0:01:20.285 **** 8119 1726773085.72865: sending task start callback 8119 1726773085.72867: entering _queue_task() for managed_node2/assert 8119 1726773085.72995: worker is 1 (out of 1 available) 8119 1726773085.73032: exiting _queue_task() for managed_node2/assert 8119 1726773085.73103: done queuing things up, now waiting for results queue to drain 8119 1726773085.73108: waiting for pending results... 
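The assert just queued (tests_change_settings.yml:162) spells out its condition in its name, and the next entries show it passing with "All assertions passed". A sketch consistent with that name and outcome; the exact expression and message are inferred, not copied from the test file:

    # Sketch: condition inferred from the task name.
    - name: Ensure kernel_settings_reboot_required is not set or is false
      assert:
        that:
          - not (kernel_settings_reboot_required | d(false))
        msg: kernel_settings_reboot_required must be unset or false after applying settings   # hypothetical message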
11306 1726773085.73172: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 11306 1726773085.73219: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000025 11306 1726773085.73264: calling self._execute() 11306 1726773085.73417: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11306 1726773085.73456: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11306 1726773085.73469: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11306 1726773085.73481: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11306 1726773085.73492: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11306 1726773085.73621: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11306 1726773085.73643: starting attempt loop 11306 1726773085.73647: running the handler 11306 1726773085.75358: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11306 1726773085.75444: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11306 1726773085.75499: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11306 1726773085.75530: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11306 1726773085.75560: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11306 1726773085.75591: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11306 1726773085.75638: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11306 1726773085.75662: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11306 1726773085.75681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11306 1726773085.75760: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11306 1726773085.75780: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11306 1726773085.75810: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11306 1726773085.76078: handler run complete 11306 1726773085.76086: attempt loop complete, returning result 11306 1726773085.76089: _execute() done 11306 1726773085.76090: dumping result to json 11306 1726773085.76092: done dumping result, returning 11306 1726773085.76096: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [12a3200b-1e9d-1dbd-cc52-000000000025] 11306 1726773085.76103: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000025 11306 1726773085.76134: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000025 11306 
1726773085.76137: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8119 1726773085.76311: no more pending results, returning what we have 8119 1726773085.76317: results queue empty 8119 1726773085.76319: checking for any_errors_fatal 8119 1726773085.76322: done checking for any_errors_fatal 8119 1726773085.76324: checking for max_fail_percentage 8119 1726773085.76327: done checking for max_fail_percentage 8119 1726773085.76329: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.76331: done checking to see if all hosts have failed 8119 1726773085.76333: getting the remaining hosts for this loop 8119 1726773085.76335: done getting the remaining hosts for this loop 8119 1726773085.76342: building list of next tasks for hosts 8119 1726773085.76345: getting the next task for host managed_node2 8119 1726773085.76351: done getting next task for host managed_node2 8119 1726773085.76354: ^ task is: TASK: Ensure role reported changed 8119 1726773085.76357: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.76359: done building task lists 8119 1726773085.76360: counting tasks in each state of execution 8119 1726773085.76364: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.76366: advancing hosts in ITERATING_TASKS 8119 1726773085.76369: starting to advance hosts 8119 1726773085.76371: getting the next task for host managed_node2 8119 1726773085.76374: done getting next task for host managed_node2 8119 1726773085.76376: ^ task is: TASK: Ensure role reported changed 8119 1726773085.76378: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.76380: done advancing hosts to next task 8119 1726773085.76396: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.76400: getting variables 8119 1726773085.76404: in VariableManager get_vars() 8119 1726773085.76437: Calling all_inventory to load vars for managed_node2 8119 1726773085.76442: Calling groups_inventory to load vars for managed_node2 8119 1726773085.76444: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.76465: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.76475: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.76489: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.76500: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.76512: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.76518: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.76528: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.76545: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.76559: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.76760: done with get_vars() 8119 1726773085.76770: done getting variables 8119 1726773085.76774: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.76776: done copying, going to template now 8119 1726773085.76778: done templating 8119 1726773085.76779: here goes the callback... TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:166 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.039) 0:01:20.324 **** 8119 1726773085.76796: sending task start callback 8119 1726773085.76799: entering _queue_task() for managed_node2/assert 8119 1726773085.76925: worker is 1 (out of 1 available) 8119 1726773085.76963: exiting _queue_task() for managed_node2/assert 8119 1726773085.77033: done queuing things up, now waiting for results queue to drain 8119 1726773085.77038: waiting for pending results... 
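The last task queued in this excerpt, "Ensure role reported changed" (tests_change_settings.yml:166), is another assert; its execution and result follow below. Given the __kernel_settings_changed: true fact set a few tasks earlier, a plausible sketch is the following, with the expression again being an assumption rather than the test's actual source:

    # Sketch: ties the assert to the __kernel_settings_changed fact set earlier in this run;
    # the real test may instead inspect a registered role result.
    - name: Ensure role reported changed
      assert:
        that:
          - __kernel_settings_changed | d(false)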
11308 1726773085.77102: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 11308 1726773085.77146: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000026 11308 1726773085.77191: calling self._execute() 11308 1726773085.77344: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11308 1726773085.77381: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11308 1726773085.77397: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11308 1726773085.77411: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11308 1726773085.77418: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11308 1726773085.77543: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11308 1726773085.77567: starting attempt loop 11308 1726773085.77570: running the handler 11308 1726773085.79272: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11308 1726773085.79353: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11308 1726773085.79403: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11308 1726773085.79432: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11308 1726773085.79472: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11308 1726773085.79505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11308 1726773085.79548: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11308 1726773085.79573: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11308 1726773085.79592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11308 1726773085.79668: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11308 1726773085.79690: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11308 1726773085.79705: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11308 1726773085.79960: handler run complete 11308 1726773085.79965: attempt loop complete, returning result 11308 1726773085.79967: _execute() done 11308 1726773085.79969: dumping result to json 11308 1726773085.79970: done dumping result, returning 11308 1726773085.79974: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [12a3200b-1e9d-1dbd-cc52-000000000026] 11308 1726773085.79980: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000026 11308 1726773085.80007: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000026 11308 1726773085.80013: WORKER PROCESS EXITING ok: [managed_node2] => { 
"changed": false } MSG: All assertions passed 8119 1726773085.80203: no more pending results, returning what we have 8119 1726773085.80207: results queue empty 8119 1726773085.80209: checking for any_errors_fatal 8119 1726773085.80213: done checking for any_errors_fatal 8119 1726773085.80215: checking for max_fail_percentage 8119 1726773085.80218: done checking for max_fail_percentage 8119 1726773085.80220: checking to see if all hosts have failed and the running result is not ok 8119 1726773085.80222: done checking to see if all hosts have failed 8119 1726773085.80224: getting the remaining hosts for this loop 8119 1726773085.80227: done getting the remaining hosts for this loop 8119 1726773085.80234: building list of next tasks for hosts 8119 1726773085.80236: getting the next task for host managed_node2 8119 1726773085.80242: done getting next task for host managed_node2 8119 1726773085.80245: ^ task is: TASK: Check sysctl 8119 1726773085.80248: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773085.80250: done building task lists 8119 1726773085.80252: counting tasks in each state of execution 8119 1726773085.80255: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773085.80257: advancing hosts in ITERATING_TASKS 8119 1726773085.80259: starting to advance hosts 8119 1726773085.80261: getting the next task for host managed_node2 8119 1726773085.80264: done getting next task for host managed_node2 8119 1726773085.80267: ^ task is: TASK: Check sysctl 8119 1726773085.80269: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773085.80270: done advancing hosts to next task 8119 1726773085.80281: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773085.80286: getting variables 8119 1726773085.80290: in VariableManager get_vars() 8119 1726773085.80316: Calling all_inventory to load vars for managed_node2 8119 1726773085.80321: Calling groups_inventory to load vars for managed_node2 8119 1726773085.80323: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773085.80343: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.80353: Calling all_plugins_play to load vars for managed_node2 8119 1726773085.80363: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.80371: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773085.80381: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.80390: Calling groups_plugins_play to load vars for managed_node2 8119 1726773085.80401: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.80424: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.80439: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773085.80646: done with get_vars() 8119 1726773085.80656: done getting variables 8119 1726773085.80660: sending task start callback, copying the task so we can template it temporarily 8119 1726773085.80662: done copying, going to template now 8119 1726773085.80664: done templating 8119 1726773085.80665: here goes the callback... TASK [Check sysctl] ************************************************************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:170 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.038) 0:01:20.363 **** 8119 1726773085.80679: sending task start callback 8119 1726773085.80681: entering _queue_task() for managed_node2/shell 8119 1726773085.80804: worker is 1 (out of 1 available) 8119 1726773085.80840: exiting _queue_task() for managed_node2/shell 8119 1726773085.80911: done queuing things up, now waiting for results queue to drain 8119 1726773085.80916: waiting for pending results... 
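The shell command executed by the worker below appears verbatim in the module result; a sketch of the corresponding test task at tests_change_settings.yml:170 follows. The changed_when override is an assumption, inferred from the raw command result saying "changed": true while the final task status is "changed": false:

    - name: Check sysctl
      shell: |
        set -euo pipefail
        sysctl -n fs.file-max | grep -qx 400001
      changed_when: false   # assumed; keeps a pure verification step from reporting a change

With grep -qx, the pipeline exits 0 only if sysctl prints exactly 400001, and set -euo pipefail turns any failure in the pipeline into a non-zero rc, so the task fails unless fs.file-max holds the expected value.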
11310 1726773085.80977: running TaskExecutor() for managed_node2/TASK: Check sysctl 11310 1726773085.81023: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000027 11310 1726773085.81069: calling self._execute() 11310 1726773085.81223: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11310 1726773085.81262: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11310 1726773085.81275: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11310 1726773085.81291: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11310 1726773085.81299: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11310 1726773085.81424: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11310 1726773085.81448: starting attempt loop 11310 1726773085.81452: running the handler 11310 1726773085.81461: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11310 1726773085.81477: _low_level_execute_command(): starting 11310 1726773085.81485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11310 1726773085.84059: stdout chunk (state=2): >>>/root <<< 11310 1726773085.84177: stderr chunk (state=3): >>><<< 11310 1726773085.84187: stdout chunk (state=3): >>><<< 11310 1726773085.84210: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11310 1726773085.84226: _low_level_execute_command(): starting 11310 1726773085.84232: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085 `" && echo ansible-tmp-1726773085.8421893-11310-116465009954085="` echo /root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085 `" ) && sleep 0' 11310 1726773085.87082: stdout chunk (state=2): >>>ansible-tmp-1726773085.8421893-11310-116465009954085=/root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085 <<< 11310 1726773085.87216: stderr chunk (state=3): >>><<< 11310 1726773085.87223: stdout chunk (state=3): >>><<< 11310 1726773085.87242: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773085.8421893-11310-116465009954085=/root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085 , stderr= 11310 1726773085.87372: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11310 1726773085.87437: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085/AnsiballZ_command.py 11310 1726773085.87755: Sending initial data 11310 1726773085.87773: Sent initial data (155 bytes) 11310 1726773085.90228: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp80fqselt 
/root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085/AnsiballZ_command.py <<< 11310 1726773085.91234: stderr chunk (state=3): >>><<< 11310 1726773085.91240: stdout chunk (state=3): >>><<< 11310 1726773085.91263: done transferring module to remote 11310 1726773085.91276: _low_level_execute_command(): starting 11310 1726773085.91281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085/ /root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085/AnsiballZ_command.py && sleep 0' 11310 1726773085.93854: stderr chunk (state=2): >>><<< 11310 1726773085.93867: stdout chunk (state=2): >>><<< 11310 1726773085.93890: _low_level_execute_command() done: rc=0, stdout=, stderr= 11310 1726773085.93896: _low_level_execute_command(): starting 11310 1726773085.93903: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085/AnsiballZ_command.py && sleep 0' 11310 1726773086.09150: stdout chunk (state=2): >>> {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:26.084012", "end": "2024-09-19 15:11:26.089697", "delta": "0:00:00.005685", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11310 1726773086.10188: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11310 1726773086.10238: stderr chunk (state=3): >>><<< 11310 1726773086.10243: stdout chunk (state=3): >>><<< 11310 1726773086.10266: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:26.084012", "end": "2024-09-19 15:11:26.089697", "delta": "0:00:00.005685", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
11310 1726773086.10301: done with _execute_module (command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11310 1726773086.10321: _low_level_execute_command(): starting 11310 1726773086.10328: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773085.8421893-11310-116465009954085/ > /dev/null 2>&1 && sleep 0' 11310 1726773086.12968: stderr chunk (state=2): >>><<< 11310 1726773086.12980: stdout chunk (state=2): >>><<< 11310 1726773086.13001: _low_level_execute_command() done: rc=0, stdout=, stderr= 11310 1726773086.13015: handler run complete 11310 1726773086.13025: attempt loop complete, returning result 11310 1726773086.13037: _execute() done 11310 1726773086.13039: dumping result to json 11310 1726773086.13043: done dumping result, returning 11310 1726773086.13057: done running TaskExecutor() for managed_node2/TASK: Check sysctl [12a3200b-1e9d-1dbd-cc52-000000000027] 11310 1726773086.13070: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000027 11310 1726773086.13112: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000027 11310 1726773086.13118: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "delta": "0:00:00.005685", "end": "2024-09-19 15:11:26.089697", "rc": 0, "start": "2024-09-19 15:11:26.084012" } 8119 1726773086.13382: no more pending results, returning what we have 8119 1726773086.13389: results queue empty 8119 1726773086.13391: checking for any_errors_fatal 8119 1726773086.13394: done checking for any_errors_fatal 8119 1726773086.13395: checking for max_fail_percentage 8119 1726773086.13398: done checking for max_fail_percentage 8119 1726773086.13399: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.13400: done checking to see if all hosts have failed 8119 1726773086.13402: getting the remaining hosts for this loop 8119 1726773086.13403: done getting the remaining hosts for this loop 8119 1726773086.13409: building list of next tasks for hosts 8119 1726773086.13411: getting the next task for host managed_node2 8119 1726773086.13416: done getting next task for host managed_node2 8119 1726773086.13418: ^ task is: TASK: Check sysfs after role runs 8119 1726773086.13420: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.13421: done building task lists 8119 1726773086.13423: counting tasks in each state of execution 8119 1726773086.13426: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.13427: advancing hosts in ITERATING_TASKS 8119 1726773086.13429: starting to advance hosts 8119 1726773086.13430: getting the next task for host managed_node2 8119 1726773086.13432: done getting next task for host managed_node2 8119 1726773086.13433: ^ task is: TASK: Check sysfs after role runs 8119 1726773086.13435: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.13436: done advancing hosts to next task 8119 1726773086.13448: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773086.13450: getting variables 8119 1726773086.13453: in VariableManager get_vars() 8119 1726773086.13478: Calling all_inventory to load vars for managed_node2 8119 1726773086.13484: Calling groups_inventory to load vars for managed_node2 8119 1726773086.13489: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.13515: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.13527: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.13537: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.13546: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.13556: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.13562: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.13571: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.13591: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.13611: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.13814: done with get_vars() 8119 1726773086.13827: done getting variables 8119 1726773086.13834: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.13836: done copying, going to template now 8119 1726773086.13838: done templating 8119 1726773086.13839: here goes the callback... 
TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:176 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.331) 0:01:20.694 **** 8119 1726773086.13853: sending task start callback 8119 1726773086.13855: entering _queue_task() for managed_node2/command 8119 1726773086.13975: worker is 1 (out of 1 available) 8119 1726773086.14018: exiting _queue_task() for managed_node2/command 8119 1726773086.14089: done queuing things up, now waiting for results queue to drain 8119 1726773086.14095: waiting for pending results... 11319 1726773086.14152: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 11319 1726773086.14195: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000028 11319 1726773086.14242: calling self._execute() 11319 1726773086.14503: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11319 1726773086.14556: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11319 1726773086.14569: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11319 1726773086.14579: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11319 1726773086.14588: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11319 1726773086.14711: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11319 1726773086.14721: starting attempt loop 11319 1726773086.14723: running the handler 11319 1726773086.14733: _low_level_execute_command(): starting 11319 1726773086.14738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11319 1726773086.17140: stdout chunk (state=2): >>>/root <<< 11319 1726773086.17257: stderr chunk (state=3): >>><<< 11319 1726773086.17262: stdout chunk (state=3): >>><<< 11319 1726773086.17281: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11319 1726773086.17298: _low_level_execute_command(): starting 11319 1726773086.17304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453 `" && echo ansible-tmp-1726773086.172921-11319-147328600645453="` echo /root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453 `" ) && sleep 0' 11319 1726773086.20268: stdout chunk (state=2): >>>ansible-tmp-1726773086.172921-11319-147328600645453=/root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453 <<< 11319 1726773086.20396: stderr chunk (state=3): >>><<< 11319 1726773086.20402: stdout chunk (state=3): >>><<< 11319 1726773086.20422: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773086.172921-11319-147328600645453=/root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453 , stderr= 11319 1726773086.20555: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11319 1726773086.20623: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453/AnsiballZ_command.py 11319 1726773086.20960: Sending initial data 11319 1726773086.20975: Sent initial data (154 bytes) 11319 1726773086.23424: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpf460er7g /root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453/AnsiballZ_command.py <<< 11319 1726773086.24436: stderr chunk (state=3): >>><<< 11319 1726773086.24442: stdout chunk (state=3): >>><<< 11319 1726773086.24464: done transferring module to remote 11319 1726773086.24479: _low_level_execute_command(): starting 11319 1726773086.24486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453/ /root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453/AnsiballZ_command.py && sleep 0' 11319 1726773086.27048: stderr chunk (state=2): >>><<< 11319 1726773086.27058: stdout chunk (state=2): >>><<< 11319 1726773086.27076: _low_level_execute_command() done: rc=0, stdout=, stderr= 11319 1726773086.27081: _low_level_execute_command(): starting 11319 1726773086.27089: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453/AnsiballZ_command.py && sleep 0' 11319 1726773086.42795: stdout chunk (state=2): >>> {"cmd": ["grep", "-x", "60666", "/sys/class/net/lo/mtu"], "stdout": "60666", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:26.421771", "end": "2024-09-19 15:11:26.425775", "delta": "0:00:00.004004", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -x 60666 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11319 1726773086.43902: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11319 1726773086.43953: stderr chunk (state=3): >>><<< 11319 1726773086.43958: stdout chunk (state=3): >>><<< 11319 1726773086.43979: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["grep", "-x", "60666", "/sys/class/net/lo/mtu"], "stdout": "60666", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:26.421771", "end": "2024-09-19 15:11:26.425775", "delta": "0:00:00.004004", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -x 60666 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
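The module_args logged above map onto a plain command task; a sketch of tests_change_settings.yml:176 reconstructed from them (changed_when: false is again an assumption, based on the final "changed": false status reported below):

    - name: Check sysfs after role runs
      command: grep -x 60666 /sys/class/net/lo/mtu
      changed_when: false   # assumed

grep -x matches whole lines only, so the check succeeds only if the loopback MTU in sysfs is exactly 60666, presumably the value the role was asked to write earlier in this test.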
11319 1726773086.44019: done with _execute_module (command, {'_raw_params': 'grep -x 60666 /sys/class/net/lo/mtu', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11319 1726773086.44035: _low_level_execute_command(): starting 11319 1726773086.44043: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773086.172921-11319-147328600645453/ > /dev/null 2>&1 && sleep 0' 11319 1726773086.46821: stderr chunk (state=2): >>><<< 11319 1726773086.46837: stdout chunk (state=2): >>><<< 11319 1726773086.46859: _low_level_execute_command() done: rc=0, stdout=, stderr= 11319 1726773086.46866: handler run complete 11319 1726773086.46877: attempt loop complete, returning result 11319 1726773086.46894: _execute() done 11319 1726773086.46899: dumping result to json 11319 1726773086.46905: done dumping result, returning 11319 1726773086.46920: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [12a3200b-1e9d-1dbd-cc52-000000000028] 11319 1726773086.46935: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000028 11319 1726773086.46976: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000028 11319 1726773086.47025: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "60666", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.004004", "end": "2024-09-19 15:11:26.425775", "rc": 0, "start": "2024-09-19 15:11:26.421771" } STDOUT: 60666 8119 1726773086.47167: no more pending results, returning what we have 8119 1726773086.47172: results queue empty 8119 1726773086.47175: checking for any_errors_fatal 8119 1726773086.47180: done checking for any_errors_fatal 8119 1726773086.47182: checking for max_fail_percentage 8119 1726773086.47188: done checking for max_fail_percentage 8119 1726773086.47190: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.47192: done checking to see if all hosts have failed 8119 1726773086.47194: getting the remaining hosts for this loop 8119 1726773086.47196: done getting the remaining hosts for this loop 8119 1726773086.47204: building list of next tasks for hosts 8119 1726773086.47207: getting the next task for host managed_node2 8119 1726773086.47213: done getting next task for host managed_node2 8119 1726773086.47216: ^ task is: TASK: Apply kernel_settings for removing section 8119 1726773086.47220: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.47222: done building task lists 8119 1726773086.47224: counting tasks in each state of execution 8119 1726773086.47228: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.47230: advancing hosts in ITERATING_TASKS 8119 1726773086.47232: starting to advance hosts 8119 1726773086.47234: getting the next task for host managed_node2 8119 1726773086.47237: done getting next task for host managed_node2 8119 1726773086.47239: ^ task is: TASK: Apply kernel_settings for removing section 8119 1726773086.47241: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.47243: done advancing hosts to next task 8119 1726773086.47259: getting variables 8119 1726773086.47262: in VariableManager get_vars() 8119 1726773086.47297: Calling all_inventory to load vars for managed_node2 8119 1726773086.47303: Calling groups_inventory to load vars for managed_node2 8119 1726773086.47305: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.47328: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.47339: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.47350: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.47358: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.47368: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.47374: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.47385: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.47410: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.47426: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.47632: done with get_vars() 8119 1726773086.47643: done getting variables 8119 1726773086.47648: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.47649: done copying, going to template now 8119 1726773086.47651: done templating 8119 1726773086.47652: here goes the callback... 
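The task announced next, at tests_change_settings.yml:180, invokes the role itself through include_role; the include processing further down confirms that fedora.linux_system_roles.kernel_settings is loaded from the collection. A minimal sketch of the call (the kernel_settings_* variables that actually remove the section are defined in the test file and are not echoed anywhere in this log):

    - name: Apply kernel_settings for removing section
      include_role:
        name: fedora.linux_system_roles.kernel_settings
      # vars: the section-removal parameters live in the test file; they are not
      # visible in this log, so none are shown here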
TASK [Apply kernel_settings for removing section] ****************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:180 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.338) 0:01:21.033 **** 8119 1726773086.47668: sending task start callback 8119 1726773086.47669: entering _queue_task() for managed_node2/include_role 8119 1726773086.47803: worker is 1 (out of 1 available) 8119 1726773086.47843: exiting _queue_task() for managed_node2/include_role 8119 1726773086.47913: done queuing things up, now waiting for results queue to drain 8119 1726773086.47919: waiting for pending results... 11328 1726773086.47986: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing section 11328 1726773086.48036: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000029 11328 1726773086.48082: calling self._execute() 11328 1726773086.48244: _execute() done 11328 1726773086.48248: dumping result to json 11328 1726773086.48251: done dumping result, returning 11328 1726773086.48254: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing section [12a3200b-1e9d-1dbd-cc52-000000000029] 11328 1726773086.48263: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000029 11328 1726773086.48294: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000029 11328 1726773086.48299: WORKER PROCESS EXITING 8119 1726773086.48461: no more pending results, returning what we have 8119 1726773086.48470: in VariableManager get_vars() 8119 1726773086.48518: Calling all_inventory to load vars for managed_node2 8119 1726773086.48523: Calling groups_inventory to load vars for managed_node2 8119 1726773086.48526: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.48557: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.48572: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.48588: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.48603: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.48622: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.48631: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.48644: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.48670: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.48693: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.48924: done with get_vars() 8119 1726773086.50703: we have included files to process 8119 1726773086.50712: generating all_blocks data 8119 1726773086.50716: done generating all_blocks data 8119 1726773086.50720: processing included file: fedora.linux_system_roles.kernel_settings 8119 1726773086.50735: in VariableManager get_vars() 8119 1726773086.50757: done with 
get_vars() 8119 1726773086.50810: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8119 1726773086.50867: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8119 1726773086.50895: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8119 1726773086.50955: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8119 1726773086.51293: in VariableManager get_vars() 8119 1726773086.51319: done with get_vars() 8119 1726773086.51472: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773086.51521: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773086.51628: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773086.51665: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773086.51784: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773086.51923: in VariableManager get_vars() 8119 1726773086.51948: done with get_vars() 8119 1726773086.52021: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8119 1726773086.52356: iterating over new_blocks loaded from include file 8119 1726773086.52361: in VariableManager get_vars() 8119 1726773086.52377: done with get_vars() 8119 1726773086.52379: filtering new block on tags 8119 1726773086.52423: done filtering new block on tags 8119 1726773086.52433: in VariableManager get_vars() 8119 1726773086.52447: done with get_vars() 8119 1726773086.52450: filtering new block on tags 8119 1726773086.52486: done filtering new block on tags 8119 1726773086.52495: in VariableManager get_vars() 8119 1726773086.52509: done with get_vars() 8119 1726773086.52512: filtering new block on tags 8119 1726773086.52618: done filtering new block on tags 8119 1726773086.52629: done iterating over new_blocks loaded from include file 8119 1726773086.52632: extending task lists for all hosts with included blocks 8119 1726773086.54325: done extending task lists 8119 1726773086.54329: done processing included files 8119 1726773086.54331: results queue empty 8119 1726773086.54332: checking for any_errors_fatal 8119 1726773086.54336: done checking for any_errors_fatal 8119 1726773086.54337: checking for max_fail_percentage 8119 1726773086.54338: done checking for max_fail_percentage 8119 1726773086.54340: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.54341: done checking to see if all hosts have failed 8119 1726773086.54342: getting the remaining hosts for this loop 8119 1726773086.54344: done getting the remaining hosts for this loop 8119 1726773086.54349: building list of next tasks for hosts 8119 1726773086.54352: getting the next task for host managed_node2 8119 1726773086.54356: done getting next task for host managed_node2 8119 1726773086.54359: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl 
settings for boolean values 8119 1726773086.54362: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.54364: done building task lists 8119 1726773086.54365: counting tasks in each state of execution 8119 1726773086.54367: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.54369: advancing hosts in ITERATING_TASKS 8119 1726773086.54370: starting to advance hosts 8119 1726773086.54372: getting the next task for host managed_node2 8119 1726773086.54374: done getting next task for host managed_node2 8119 1726773086.54376: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8119 1726773086.54378: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.54380: done advancing hosts to next task 8119 1726773086.54387: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773086.54390: getting variables 8119 1726773086.54392: in VariableManager get_vars() 8119 1726773086.54407: Calling all_inventory to load vars for managed_node2 8119 1726773086.54410: Calling groups_inventory to load vars for managed_node2 8119 1726773086.54413: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.54429: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.54437: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.54447: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.54455: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.54469: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.54477: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.54490: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.54509: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, 
class_only=False) 8119 1726773086.54523: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.54716: done with get_vars() 8119 1726773086.54726: done getting variables 8119 1726773086.54731: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.54732: done copying, going to template now 8119 1726773086.54734: done templating 8119 1726773086.54736: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.070) 0:01:21.103 **** 8119 1726773086.54751: sending task start callback 8119 1726773086.54753: entering _queue_task() for managed_node2/fail 8119 1726773086.54910: worker is 1 (out of 1 available) 8119 1726773086.54949: exiting _queue_task() for managed_node2/fail 8119 1726773086.55022: done queuing things up, now waiting for results queue to drain 8119 1726773086.55028: waiting for pending results... 11330 1726773086.55092: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 11330 1726773086.55148: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a0 11330 1726773086.55197: calling self._execute() 11330 1726773086.59934: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11330 1726773086.60015: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11330 1726773086.60072: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11330 1726773086.60103: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11330 1726773086.60144: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11330 1726773086.60179: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11330 1726773086.60237: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11330 1726773086.60265: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11330 1726773086.60290: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11330 1726773086.60401: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11330 1726773086.60428: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11330 1726773086.60448: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11330 1726773086.60757: when evaluation is False, skipping this task 11330 1726773086.60763: _execute() done 11330 1726773086.60766: dumping result to json 11330 1726773086.60769: done dumping result, returning 11330 1726773086.60775: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [12a3200b-1e9d-1dbd-cc52-0000000009a0] 11330 1726773086.60787: sending task result for task 
12a3200b-1e9d-1dbd-cc52-0000000009a0 11330 1726773086.60815: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a0 11330 1726773086.60819: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773086.63185: no more pending results, returning what we have 8119 1726773086.63189: results queue empty 8119 1726773086.63191: checking for any_errors_fatal 8119 1726773086.63193: done checking for any_errors_fatal 8119 1726773086.63194: checking for max_fail_percentage 8119 1726773086.63196: done checking for max_fail_percentage 8119 1726773086.63197: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.63198: done checking to see if all hosts have failed 8119 1726773086.63200: getting the remaining hosts for this loop 8119 1726773086.63201: done getting the remaining hosts for this loop 8119 1726773086.63206: building list of next tasks for hosts 8119 1726773086.63209: getting the next task for host managed_node2 8119 1726773086.63215: done getting next task for host managed_node2 8119 1726773086.63219: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773086.63221: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.63223: done building task lists 8119 1726773086.63224: counting tasks in each state of execution 8119 1726773086.63227: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.63228: advancing hosts in ITERATING_TASKS 8119 1726773086.63230: starting to advance hosts 8119 1726773086.63231: getting the next task for host managed_node2 8119 1726773086.63234: done getting next task for host managed_node2 8119 1726773086.63236: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773086.63239: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.63240: done advancing hosts to next task 8119 1726773086.63252: getting variables 8119 1726773086.63254: in VariableManager get_vars() 8119 1726773086.63274: Calling all_inventory to load vars for managed_node2 8119 1726773086.63277: Calling groups_inventory to load vars for managed_node2 8119 1726773086.63279: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.63300: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.63311: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.63323: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.63335: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.63347: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.63353: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.63362: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.63379: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.63396: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.63588: done with get_vars() 8119 1726773086.63599: done getting variables 8119 1726773086.63603: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.63605: done copying, going to template now 8119 1726773086.63607: done templating 8119 1726773086.63609: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.088) 0:01:21.192 **** 8119 1726773086.63624: sending task start callback 8119 1726773086.63625: entering _queue_task() for managed_node2/include_tasks 8119 1726773086.63773: worker is 1 (out of 1 available) 8119 1726773086.63813: exiting _queue_task() for managed_node2/include_tasks 8119 1726773086.63886: done queuing things up, now waiting for results queue to drain 8119 1726773086.63892: waiting for pending results... 
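The role's "Set version specific variables" task hands off to an included task file; the processing below shows set_vars.yml being loaded from the role's tasks directory and contributing several new blocks. A sketch of the calling task at roles/kernel_settings/tasks/main.yml:9:

    - name: Set version specific variables
      include_tasks: set_vars.yml

Because include_tasks only queues more work, the task itself returns immediately, and the host's task list is extended afterwards, which is what the "extending task lists for all hosts with included blocks" message reflects.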
11332 1726773086.63953: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 11332 1726773086.64015: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a1 11332 1726773086.64059: calling self._execute() 11332 1726773086.64166: _execute() done 11332 1726773086.64170: dumping result to json 11332 1726773086.64173: done dumping result, returning 11332 1726773086.64176: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [12a3200b-1e9d-1dbd-cc52-0000000009a1] 11332 1726773086.64187: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a1 11332 1726773086.64223: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a1 11332 1726773086.64278: WORKER PROCESS EXITING 8119 1726773086.64431: no more pending results, returning what we have 8119 1726773086.64440: in VariableManager get_vars() 8119 1726773086.64475: Calling all_inventory to load vars for managed_node2 8119 1726773086.64479: Calling groups_inventory to load vars for managed_node2 8119 1726773086.64481: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.64526: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.64538: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.64550: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.64559: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.64569: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.64576: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.64587: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.64607: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.64622: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.64831: done with get_vars() 8119 1726773086.64870: we have included files to process 8119 1726773086.64872: generating all_blocks data 8119 1726773086.64874: done generating all_blocks data 8119 1726773086.64879: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773086.64881: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773086.64885: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773086.65016: plugin lookup for setup failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773086.65089: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773086.65165: plugin lookup for stat failed; 
errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8119 1726773086.65278: done processing included file 8119 1726773086.65280: iterating over new_blocks loaded from include file 8119 1726773086.65284: in VariableManager get_vars() 8119 1726773086.65306: done with get_vars() 8119 1726773086.65310: filtering new block on tags 8119 1726773086.65350: done filtering new block on tags 8119 1726773086.65359: in VariableManager get_vars() 8119 1726773086.65375: done with get_vars() 8119 1726773086.65378: filtering new block on tags 8119 1726773086.65441: done filtering new block on tags 8119 1726773086.65450: in VariableManager get_vars() 8119 1726773086.65467: done with get_vars() 8119 1726773086.65469: filtering new block on tags 8119 1726773086.65516: done filtering new block on tags 8119 1726773086.65526: in VariableManager get_vars() 8119 1726773086.65543: done with get_vars() 8119 1726773086.65545: filtering new block on tags 8119 1726773086.65578: done filtering new block on tags 8119 1726773086.65587: done iterating over new_blocks loaded from include file 8119 1726773086.65589: extending task lists for all hosts with included blocks 8119 1726773086.65667: done extending task lists 8119 1726773086.65670: done processing included files 8119 1726773086.65671: results queue empty 8119 1726773086.65673: checking for any_errors_fatal 8119 1726773086.65676: done checking for any_errors_fatal 8119 1726773086.65677: checking for max_fail_percentage 8119 1726773086.65679: done checking for max_fail_percentage 8119 1726773086.65680: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.65681: done checking to see if all hosts have failed 8119 1726773086.65684: getting the remaining hosts for this loop 8119 1726773086.65686: done getting the remaining hosts for this loop 8119 1726773086.65690: building list of next tasks for hosts 8119 1726773086.65692: getting the next task for host managed_node2 8119 1726773086.65696: done getting next task for host managed_node2 8119 1726773086.65698: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773086.65701: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.65703: done building task lists 8119 1726773086.65704: counting tasks in each state of execution 8119 1726773086.65707: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.65708: advancing hosts in ITERATING_TASKS 8119 1726773086.65710: starting to advance hosts 8119 1726773086.65712: getting the next task for host managed_node2 8119 1726773086.65714: done getting next task for host managed_node2 8119 1726773086.65716: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773086.65719: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.65720: done advancing hosts to next task 8119 1726773086.65726: getting variables 8119 1726773086.65727: in VariableManager get_vars() 8119 1726773086.65740: Calling all_inventory to load vars for managed_node2 8119 1726773086.65745: Calling groups_inventory to load vars for managed_node2 8119 1726773086.65748: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.65761: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.65768: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.65778: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.65789: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.65800: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.65806: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.65817: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.65834: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.65847: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.66045: done with get_vars() 8119 1726773086.66055: done getting variables 8119 1726773086.66059: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.66061: done copying, going to template now 8119 1726773086.66062: done templating 8119 1726773086.66064: here goes the callback... 
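The callback announced next is the setup task at tasks/set_vars.yml:2, which the worker then skips because its when condition evaluates to False. A hedged sketch of such a fact-gathering guard follows; the gather_subset value and the guarding condition are illustrative assumptions, as the log only confirms the task name, the setup action, and the skip:

# tasks/set_vars.yml (sketch) - gather facts only when the facts the role needs are missing
- name: Ensure ansible_facts used by role
  setup:
    gather_subset: min
  when: ansible_facts.distribution is not defined  # illustrative condition, not taken from the log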
TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.024) 0:01:21.217 **** 8119 1726773086.66078: sending task start callback 8119 1726773086.66080: entering _queue_task() for managed_node2/setup 8119 1726773086.66214: worker is 1 (out of 1 available) 8119 1726773086.66250: exiting _queue_task() for managed_node2/setup 8119 1726773086.66323: done queuing things up, now waiting for results queue to drain 8119 1726773086.66328: waiting for pending results... 11334 1726773086.66392: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 11334 1726773086.66451: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000b78 11334 1726773086.66497: calling self._execute() 11334 1726773086.68278: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11334 1726773086.68369: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11334 1726773086.68447: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11334 1726773086.68477: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11334 1726773086.68511: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11334 1726773086.68544: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11334 1726773086.68592: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11334 1726773086.68618: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11334 1726773086.68638: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11334 1726773086.68719: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11334 1726773086.68736: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11334 1726773086.68754: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11334 1726773086.69166: when evaluation is False, skipping this task 11334 1726773086.69171: _execute() done 11334 1726773086.69173: dumping result to json 11334 1726773086.69175: done dumping result, returning 11334 1726773086.69179: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [12a3200b-1e9d-1dbd-cc52-000000000b78] 11334 1726773086.69189: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b78 11334 1726773086.69216: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b78 11334 1726773086.69219: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773086.69427: no more pending results, returning what we have 8119 1726773086.69432: results queue empty 8119 1726773086.69434: checking for any_errors_fatal 8119 1726773086.69439: done checking for any_errors_fatal 8119 1726773086.69441: checking for max_fail_percentage 8119 
1726773086.69444: done checking for max_fail_percentage 8119 1726773086.69446: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.69448: done checking to see if all hosts have failed 8119 1726773086.69450: getting the remaining hosts for this loop 8119 1726773086.69452: done getting the remaining hosts for this loop 8119 1726773086.69460: building list of next tasks for hosts 8119 1726773086.69462: getting the next task for host managed_node2 8119 1726773086.69473: done getting next task for host managed_node2 8119 1726773086.69478: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773086.69484: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.69488: done building task lists 8119 1726773086.69490: counting tasks in each state of execution 8119 1726773086.69493: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.69496: advancing hosts in ITERATING_TASKS 8119 1726773086.69498: starting to advance hosts 8119 1726773086.69500: getting the next task for host managed_node2 8119 1726773086.69506: done getting next task for host managed_node2 8119 1726773086.69508: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773086.69511: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.69513: done advancing hosts to next task 8119 1726773086.69525: getting variables 8119 1726773086.69527: in VariableManager get_vars() 8119 1726773086.69554: Calling all_inventory to load vars for managed_node2 8119 1726773086.69558: Calling groups_inventory to load vars for managed_node2 8119 1726773086.69560: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.69580: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.69596: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.69609: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.69619: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.69629: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.69636: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.69645: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.69663: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.69676: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.69891: done with get_vars() 8119 1726773086.69901: done getting variables 8119 1726773086.69906: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.69908: done copying, going to template now 8119 1726773086.69911: done templating 8119 1726773086.69912: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.038) 0:01:21.255 **** 8119 1726773086.69931: sending task start callback 8119 1726773086.69934: entering _queue_task() for managed_node2/stat 8119 1726773086.70062: worker is 1 (out of 1 available) 8119 1726773086.70103: exiting _queue_task() for managed_node2/stat 8119 1726773086.70170: done queuing things up, now waiting for results queue to drain 8119 1726773086.70175: waiting for pending results... 
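The next worker run (11336) handles the stat task at tasks/set_vars.yml:10 and skips it as well. A plausible sketch of that check, assuming the conventional ostree marker path and a hypothetical register/flag name (neither is shown in this log):

# tasks/set_vars.yml (sketch) - path, register and flag names are assumptions
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __kernel_settings_is_ostree is defined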
11336 1726773086.70241: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 11336 1726773086.70310: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000b7a 11336 1726773086.70352: calling self._execute() 11336 1726773086.72531: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11336 1726773086.72619: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11336 1726773086.72674: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11336 1726773086.72707: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11336 1726773086.72743: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11336 1726773086.72774: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11336 1726773086.72831: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11336 1726773086.72857: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11336 1726773086.72886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11336 1726773086.72969: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11336 1726773086.72990: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11336 1726773086.73005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11336 1726773086.73262: when evaluation is False, skipping this task 11336 1726773086.73266: _execute() done 11336 1726773086.73268: dumping result to json 11336 1726773086.73270: done dumping result, returning 11336 1726773086.73274: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [12a3200b-1e9d-1dbd-cc52-000000000b7a] 11336 1726773086.73285: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7a 11336 1726773086.73316: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7a 11336 1726773086.73359: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773086.73504: no more pending results, returning what we have 8119 1726773086.73510: results queue empty 8119 1726773086.73513: checking for any_errors_fatal 8119 1726773086.73519: done checking for any_errors_fatal 8119 1726773086.73521: checking for max_fail_percentage 8119 1726773086.73524: done checking for max_fail_percentage 8119 1726773086.73526: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.73528: done checking to see if all hosts have failed 8119 1726773086.73530: getting the remaining hosts for this loop 8119 1726773086.73533: done getting the remaining hosts for this loop 8119 1726773086.73540: building list of next tasks for hosts 8119 1726773086.73542: getting the next task for host managed_node2 8119 1726773086.73550: done getting next task for host managed_node2 8119 1726773086.73556: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 
1726773086.73561: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.73564: done building task lists 8119 1726773086.73565: counting tasks in each state of execution 8119 1726773086.73569: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.73571: advancing hosts in ITERATING_TASKS 8119 1726773086.73573: starting to advance hosts 8119 1726773086.73575: getting the next task for host managed_node2 8119 1726773086.73579: done getting next task for host managed_node2 8119 1726773086.73582: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 1726773086.73588: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.73590: done advancing hosts to next task 8119 1726773086.73604: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773086.73607: getting variables 8119 1726773086.73610: in VariableManager get_vars() 8119 1726773086.73637: Calling all_inventory to load vars for managed_node2 8119 1726773086.73640: Calling groups_inventory to load vars for managed_node2 8119 1726773086.73642: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.73662: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.73673: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.73684: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.73698: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.73711: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.73718: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.73728: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.73745: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.73759: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.73972: done with get_vars() 8119 1726773086.73985: done getting variables 8119 1726773086.73991: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.73992: done copying, going to template now 8119 1726773086.73995: done templating 8119 1726773086.73996: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.040) 0:01:21.296 **** 8119 1726773086.74012: sending task start callback 8119 1726773086.74014: entering _queue_task() for managed_node2/set_fact 8119 1726773086.74136: worker is 1 (out of 1 available) 8119 1726773086.74172: exiting _queue_task() for managed_node2/set_fact 8119 1726773086.74242: done queuing things up, now waiting for results queue to drain 8119 1726773086.74247: waiting for pending results... 
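The set_fact task queued here (tasks/set_vars.yml:15) is the companion to the previous stat check and is likewise skipped. A hedged sketch, reusing the hypothetical variable names introduced above (the log confirms only the task name, the set_fact action, and the skip):

# tasks/set_vars.yml (sketch) - flag and stat variable names are assumptions
- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined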
11341 1726773086.74315: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 11341 1726773086.74374: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000b7b 11341 1726773086.74421: calling self._execute() 11341 1726773086.76212: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11341 1726773086.76300: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11341 1726773086.76359: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11341 1726773086.76396: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11341 1726773086.76427: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11341 1726773086.76466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11341 1726773086.76518: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11341 1726773086.76542: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11341 1726773086.76558: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11341 1726773086.76641: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11341 1726773086.76658: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11341 1726773086.76672: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11341 1726773086.76931: when evaluation is False, skipping this task 11341 1726773086.76935: _execute() done 11341 1726773086.76937: dumping result to json 11341 1726773086.76939: done dumping result, returning 11341 1726773086.76943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [12a3200b-1e9d-1dbd-cc52-000000000b7b] 11341 1726773086.76951: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7b 11341 1726773086.76979: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7b 11341 1726773086.76984: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773086.77111: no more pending results, returning what we have 8119 1726773086.77116: results queue empty 8119 1726773086.77119: checking for any_errors_fatal 8119 1726773086.77122: done checking for any_errors_fatal 8119 1726773086.77124: checking for max_fail_percentage 8119 1726773086.77128: done checking for max_fail_percentage 8119 1726773086.77130: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.77131: done checking to see if all hosts have failed 8119 1726773086.77133: getting the remaining hosts for this loop 8119 1726773086.77136: done getting the remaining hosts for this loop 8119 1726773086.77143: building list of next tasks for hosts 8119 1726773086.77146: getting the next task for host managed_node2 8119 1726773086.77156: done getting next task for host managed_node2 8119 1726773086.77161: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update 
exists in /sbin 8119 1726773086.77166: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.77168: done building task lists 8119 1726773086.77170: counting tasks in each state of execution 8119 1726773086.77174: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.77176: advancing hosts in ITERATING_TASKS 8119 1726773086.77178: starting to advance hosts 8119 1726773086.77180: getting the next task for host managed_node2 8119 1726773086.77187: done getting next task for host managed_node2 8119 1726773086.77190: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8119 1726773086.77194: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.77197: done advancing hosts to next task 8119 1726773086.77211: getting variables 8119 1726773086.77215: in VariableManager get_vars() 8119 1726773086.77249: Calling all_inventory to load vars for managed_node2 8119 1726773086.77254: Calling groups_inventory to load vars for managed_node2 8119 1726773086.77257: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.77285: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.77302: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.77318: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.77331: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.77346: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.77354: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.77364: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.77382: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.77399: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.77631: done with get_vars() 8119 1726773086.77641: done getting variables 8119 1726773086.77645: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.77647: done copying, going to template now 8119 1726773086.77649: done templating 8119 1726773086.77650: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.036) 0:01:21.333 **** 8119 1726773086.77666: sending task start callback 8119 1726773086.77667: entering _queue_task() for managed_node2/stat 8119 1726773086.77790: worker is 1 (out of 1 available) 8119 1726773086.77827: exiting _queue_task() for managed_node2/stat 8119 1726773086.77899: done queuing things up, now waiting for results queue to drain 8119 1726773086.77905: waiting for pending results... 
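Worker 11343 next handles the stat task at tasks/set_vars.yml:22, again skipped by its condition. A plausible sketch; the path comes from the task title, while the register and flag names are assumptions:

# tasks/set_vars.yml (sketch) - register and flag names are assumptions
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat
  when: not __kernel_settings_is_transactional is defined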
11343 1726773086.77967: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 11343 1726773086.78029: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000b7d 11343 1726773086.78073: calling self._execute() 11343 1726773086.79807: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11343 1726773086.79899: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11343 1726773086.79964: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11343 1726773086.79996: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11343 1726773086.80024: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11343 1726773086.80051: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11343 1726773086.80098: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11343 1726773086.80125: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11343 1726773086.80142: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11343 1726773086.80223: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11343 1726773086.80241: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11343 1726773086.80255: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11343 1726773086.80652: when evaluation is False, skipping this task 11343 1726773086.80657: _execute() done 11343 1726773086.80658: dumping result to json 11343 1726773086.80660: done dumping result, returning 11343 1726773086.80664: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [12a3200b-1e9d-1dbd-cc52-000000000b7d] 11343 1726773086.80671: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7d 11343 1726773086.80698: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7d 11343 1726773086.80703: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773086.80911: no more pending results, returning what we have 8119 1726773086.80916: results queue empty 8119 1726773086.80919: checking for any_errors_fatal 8119 1726773086.80922: done checking for any_errors_fatal 8119 1726773086.80924: checking for max_fail_percentage 8119 1726773086.80927: done checking for max_fail_percentage 8119 1726773086.80929: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.80931: done checking to see if all hosts have failed 8119 1726773086.80933: getting the remaining hosts for this loop 8119 1726773086.80936: done getting the remaining hosts for this loop 8119 1726773086.80943: building list of next tasks for hosts 8119 1726773086.80945: getting the next task for host managed_node2 8119 1726773086.80951: done getting next task for host managed_node2 8119 1726773086.80955: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if 
transactional-update exists 8119 1726773086.80958: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.80959: done building task lists 8119 1726773086.80960: counting tasks in each state of execution 8119 1726773086.80963: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.80965: advancing hosts in ITERATING_TASKS 8119 1726773086.80966: starting to advance hosts 8119 1726773086.80968: getting the next task for host managed_node2 8119 1726773086.80971: done getting next task for host managed_node2 8119 1726773086.80972: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8119 1726773086.80975: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.80976: done advancing hosts to next task 8119 1726773086.80989: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773086.80993: getting variables 8119 1726773086.80995: in VariableManager get_vars() 8119 1726773086.81022: Calling all_inventory to load vars for managed_node2 8119 1726773086.81026: Calling groups_inventory to load vars for managed_node2 8119 1726773086.81028: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.81050: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.81062: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.81073: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.81082: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.81095: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.81102: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.81114: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.81133: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.81146: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.81354: done with get_vars() 8119 1726773086.81364: done getting variables 8119 1726773086.81369: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.81371: done copying, going to template now 8119 1726773086.81373: done templating 8119 1726773086.81374: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.037) 0:01:21.370 **** 8119 1726773086.81394: sending task start callback 8119 1726773086.81397: entering _queue_task() for managed_node2/set_fact 8119 1726773086.81515: worker is 1 (out of 1 available) 8119 1726773086.81550: exiting _queue_task() for managed_node2/set_fact 8119 1726773086.81624: done queuing things up, now waiting for results queue to drain 8119 1726773086.81629: waiting for pending results... 
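The matching set_fact task at tasks/set_vars.yml:27 is queued next and skipped as well. A hedged sketch, continuing the hypothetical names used above:

# tasks/set_vars.yml (sketch) - flag name is an assumption
- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
  when: not __kernel_settings_is_transactional is defined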
11345 1726773086.81682: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 11345 1726773086.81742: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000b7e 11345 1726773086.81788: calling self._execute() 11345 1726773086.83672: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11345 1726773086.83752: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11345 1726773086.83804: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11345 1726773086.83831: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11345 1726773086.83857: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11345 1726773086.83890: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11345 1726773086.83936: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11345 1726773086.83957: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11345 1726773086.83976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11345 1726773086.84062: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11345 1726773086.84080: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11345 1726773086.84103: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11345 1726773086.84351: when evaluation is False, skipping this task 11345 1726773086.84356: _execute() done 11345 1726773086.84357: dumping result to json 11345 1726773086.84359: done dumping result, returning 11345 1726773086.84363: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [12a3200b-1e9d-1dbd-cc52-000000000b7e] 11345 1726773086.84370: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7e 11345 1726773086.84397: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b7e 11345 1726773086.84444: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773086.84531: no more pending results, returning what we have 8119 1726773086.84535: results queue empty 8119 1726773086.84537: checking for any_errors_fatal 8119 1726773086.84541: done checking for any_errors_fatal 8119 1726773086.84543: checking for max_fail_percentage 8119 1726773086.84546: done checking for max_fail_percentage 8119 1726773086.84548: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.84551: done checking to see if all hosts have failed 8119 1726773086.84553: getting the remaining hosts for this loop 8119 1726773086.84556: done getting the remaining hosts for this loop 8119 1726773086.84563: building list of next tasks for hosts 8119 1726773086.84565: getting the next task for host managed_node2 8119 1726773086.84575: done getting next task for host managed_node2 8119 1726773086.84581: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version 
specific variables 8119 1726773086.84587: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.84590: done building task lists 8119 1726773086.84592: counting tasks in each state of execution 8119 1726773086.84596: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.84598: advancing hosts in ITERATING_TASKS 8119 1726773086.84600: starting to advance hosts 8119 1726773086.84603: getting the next task for host managed_node2 8119 1726773086.84611: done getting next task for host managed_node2 8119 1726773086.84614: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8119 1726773086.84617: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.84620: done advancing hosts to next task 8119 1726773086.84633: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773086.84638: getting variables 8119 1726773086.84642: in VariableManager get_vars() 8119 1726773086.84675: Calling all_inventory to load vars for managed_node2 8119 1726773086.84680: Calling groups_inventory to load vars for managed_node2 8119 1726773086.84685: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.84712: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.84725: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.84736: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.84745: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.84756: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.84762: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.84771: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.84791: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.84811: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.85025: done with get_vars() 8119 1726773086.85036: done getting variables 8119 1726773086.85043: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.85045: done copying, going to template now 8119 1726773086.85047: done templating 8119 1726773086.85048: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.036) 0:01:21.407 **** 8119 1726773086.85064: sending task start callback 8119 1726773086.85066: entering _queue_task() for managed_node2/include_vars 8119 1726773086.85181: worker is 1 (out of 1 available) 8119 1726773086.85222: exiting _queue_task() for managed_node2/include_vars 8119 1726773086.85295: done queuing things up, now waiting for results queue to drain 8119 1726773086.85300: waiting for pending results... 
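The include_vars task that runs next (worker 11347, tasks/set_vars.yml:31) uses the first_found lookup and, per the result below, ends up loading the role's vars/default.yml, which supplies __kernel_settings_packages and __kernel_settings_services. A sketch of the usual shape of such a task; the candidate file name patterns and the params variable name are assumptions, only default.yml is confirmed by this log:

# tasks/set_vars.yml (sketch) - candidate patterns are assumptions; the log confirms default.yml was loaded
- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', __vars_file_params) }}"
  vars:
    __vars_file_params:
      files:
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['os_family'] }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"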
11347 1726773086.85354: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 11347 1726773086.85415: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000b80 11347 1726773086.85458: calling self._execute() 11347 1726773086.87372: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11347 1726773086.87451: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11347 1726773086.87504: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11347 1726773086.87535: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11347 1726773086.87561: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11347 1726773086.87592: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11347 1726773086.87649: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11347 1726773086.87672: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11347 1726773086.87691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11347 1726773086.87771: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11347 1726773086.87790: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11347 1726773086.87805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11347 1726773086.88527: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup 11347 1726773086.88673: Loaded config def from plugin (lookup/first_found) 11347 1726773086.88678: Loading LookupModule 'first_found' from /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup/first_found.py 11347 1726773086.88732: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11347 1726773086.88764: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11347 1726773086.88776: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11347 1726773086.88789: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11347 1726773086.88795: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11347 1726773086.88884: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11347 1726773086.88895: starting attempt loop 11347 1726773086.88897: running the handler 11347 1726773086.88936: handler run complete 11347 1726773086.88940: attempt loop complete, returning result 11347 1726773086.88942: _execute() done 11347 1726773086.88944: dumping result to json 11347 1726773086.88946: done dumping result, returning 11347 1726773086.88950: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [12a3200b-1e9d-1dbd-cc52-000000000b80] 11347 1726773086.88958: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b80 11347 1726773086.88985: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000b80 11347 1726773086.88989: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8119 1726773086.89325: no more pending results, returning what we have 8119 1726773086.89328: results queue empty 8119 1726773086.89330: checking for any_errors_fatal 8119 1726773086.89333: done checking for any_errors_fatal 8119 1726773086.89334: checking for max_fail_percentage 8119 1726773086.89336: done checking for max_fail_percentage 8119 1726773086.89337: checking to see if all hosts have failed and the running result is not ok 8119 1726773086.89339: done checking to see if all hosts have failed 8119 1726773086.89340: getting the remaining hosts for this loop 8119 1726773086.89342: done getting the remaining hosts for this loop 8119 1726773086.89347: building list of next tasks for hosts 8119 1726773086.89349: getting the next task for host managed_node2 8119 1726773086.89355: done getting next task for host managed_node2 8119 1726773086.89358: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773086.89360: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773086.89362: done building task lists 8119 1726773086.89363: counting tasks in each state of execution 8119 1726773086.89366: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773086.89367: advancing hosts in ITERATING_TASKS 8119 1726773086.89369: starting to advance hosts 8119 1726773086.89370: getting the next task for host managed_node2 8119 1726773086.89373: done getting next task for host managed_node2 8119 1726773086.89375: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773086.89377: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773086.89378: done advancing hosts to next task 8119 1726773086.89392: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773086.89396: getting variables 8119 1726773086.89398: in VariableManager get_vars() 8119 1726773086.89427: Calling all_inventory to load vars for managed_node2 8119 1726773086.89431: Calling groups_inventory to load vars for managed_node2 8119 1726773086.89434: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773086.89455: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.89464: Calling all_plugins_play to load vars for managed_node2 8119 1726773086.89474: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.89485: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773086.89499: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.89506: Calling groups_plugins_play to load vars for managed_node2 8119 1726773086.89517: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.89539: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.89554: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773086.89758: done with get_vars() 8119 1726773086.89769: done getting variables 8119 1726773086.89774: sending task start callback, copying the task so we can template it temporarily 8119 1726773086.89775: done copying, going to template now 8119 1726773086.89777: done templating 8119 1726773086.89778: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.047) 0:01:21.454 **** 8119 1726773086.89796: sending task start callback 8119 1726773086.89798: entering _queue_task() for managed_node2/package 8119 1726773086.89919: worker is 1 (out of 1 available) 8119 1726773086.89956: exiting _queue_task() for managed_node2/package 8119 1726773086.90025: done queuing things up, now waiting for results queue to drain 8119 1726773086.90031: waiting for pending results... 
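[editor's note] The include_vars result above shows which platform/version specific vars file first_found resolved to (roles/kernel_settings/vars/default.yml) and the two facts it sets. A minimal sketch of what that vars file would contain, reconstructed only from the ansible_facts printed in the log rather than from the collection source (the real file may carry additional keys):

__kernel_settings_packages:
  - tuned
  - python3-configobj
__kernel_settings_services:
  - tuned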
11349 1726773086.90095: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 11349 1726773086.90145: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a2 11349 1726773086.90192: calling self._execute() 11349 1726773086.92115: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11349 1726773086.92240: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11349 1726773086.92321: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11349 1726773086.92348: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11349 1726773086.92373: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11349 1726773086.92402: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11349 1726773086.92454: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11349 1726773086.92476: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11349 1726773086.92498: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11349 1726773086.92588: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11349 1726773086.92606: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11349 1726773086.92623: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11349 1726773086.92775: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11349 1726773086.92781: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11349 1726773086.92785: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11349 1726773086.92787: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11349 1726773086.92789: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11349 1726773086.92791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11349 1726773086.92793: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11349 1726773086.92794: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11349 1726773086.92796: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11349 1726773086.92813: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11349 1726773086.92816: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11349 1726773086.92818: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11349 1726773086.92981: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11349 1726773086.93024: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11349 1726773086.93035: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11349 1726773086.93045: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11349 1726773086.93050: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11349 1726773086.93160: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11349 1726773086.93169: starting attempt loop 11349 1726773086.93171: running the handler 11349 1726773086.93284: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale 11349 1726773086.93297: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity 11349 1726773086.93304: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap 11349 1726773086.93313: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox 11349 1726773086.93322: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios 11349 1726773086.93347: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/__pycache__ 11349 1726773086.93362: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics/__pycache__ 11349 1726773086.93367: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale/__pycache__ 11349 1726773086.93371: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity/__pycache__ 11349 1726773086.93375: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap/__pycache__ 11349 1726773086.93380: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox/__pycache__ 11349 1726773086.93389: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios/__pycache__ 11349 1726773086.93402: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10 11349 1726773086.93412: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci 11349 1726773086.93521: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos 11349 1726773086.93529: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos 11349 1726773086.93543: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba 11349 1726773086.93549: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa 11349 1726773086.93558: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi 11349 1726773086.93636: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch 11349 1726773086.93645: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point 11349 1726773086.93747: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix 11349 1726773086.93754: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli 11349 1726773086.93760: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine 11349 1726773086.93832: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision 11349 1726773086.93839: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos 11349 1726773086.93867: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus 11349 1726773086.93878: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10 11349 1726773086.93886: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6 11349 1726773086.93893: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9 11349 1726773086.93903: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos 11349 1726773086.93912: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch 11349 1726773086.93919: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos 11349 1726773086.93925: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos 11349 1726773086.93956: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli 11349 1726773086.93962: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos 11349 1726773086.93969: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5 11349 1726773086.94137: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files 11349 1726773086.94145: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer 11349 1726773086.94149: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager 11349 1726773086.94181: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios 11349 1726773086.94641: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr 11349 1726773086.94649: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd 11349 1726773086.94656: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx 11349 1726773086.94673: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos 11349 1726773086.94690: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate 11349 1726773086.94698: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface 11349 1726773086.94704: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios 11349 1726773086.94738: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr 11349 1726773086.94758: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware 11349 1726773086.94765: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential 11349 1726773086.94770: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/junos 11349 1726773086.94802: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2 11349 1726773086.94812: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3 11349 1726773086.94818: 
trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki 11349 1726773086.94842: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact 11349 1726773086.94848: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf 11349 1726773086.94854: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler 11349 1726773086.94870: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor 11349 1726773086.94925: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos 11349 1726773086.94933: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso 11349 1726773086.94941: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage 11349 1726773086.94945: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos 11349 1726773086.95028: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx 11349 1726773086.95058: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx 11349 1726773086.95065: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance 11349 1726773086.95073: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs 11349 1726773086.95079: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos 11349 1726773086.95105: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol 11349 1726773086.95112: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware 11349 1726773086.95120: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf 11349 1726773086.95125: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros 11349 1726773086.95131: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing 11349 1726773086.95135: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive 11349 1726773086.95144: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos 11349 1726773086.95157: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros 11349 1726773086.95163: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system 11349 1726773086.95171: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss 11349 1726773086.95178: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos 11349 1726773086.95204: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/__pycache__ 11349 1726773086.95211: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10/__pycache__ 11349 1726773086.95217: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci/__pycache__ 11349 1726773086.95287: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos/__pycache__ 11349 1726773086.95295: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos/__pycache__ 11349 1726773086.95306: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba/__pycache__ 11349 1726773086.95312: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa/__pycache__ 11349 1726773086.95318: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi/__pycache__ 11349 1726773086.95358: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch/__pycache__ 11349 1726773086.95363: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point/__pycache__ 11349 1726773086.95432: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix/__pycache__ 11349 1726773086.95438: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli/__pycache__ 11349 1726773086.95443: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine/__pycache__ 11349 1726773086.95487: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision/__pycache__ 11349 1726773086.95495: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos/__pycache__ 11349 1726773086.95526: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus/__pycache__ 11349 1726773086.95536: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10/__pycache__ 11349 1726773086.95541: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6/__pycache__ 11349 1726773086.95546: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9/__pycache__ 11349 1726773086.95551: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos/__pycache__ 11349 1726773086.95555: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch/__pycache__ 11349 1726773086.95559: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos/__pycache__ 11349 1726773086.95564: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos/__pycache__ 11349 1726773086.95582: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli/__pycache__ 11349 1726773086.95588: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos/__pycache__ 11349 1726773086.95593: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5/__pycache__ 11349 1726773086.95703: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files/__pycache__ 11349 1726773086.95711: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer/__pycache__ 11349 1726773086.95716: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager/__pycache__ 11349 1726773086.95737: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios/__pycache__ 11349 1726773086.96012: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr/__pycache__ 11349 1726773086.96020: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd/__pycache__ 11349 1726773086.96025: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx/__pycache__ 11349 1726773086.96037: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos/__pycache__ 11349 1726773086.96047: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate/__pycache__ 11349 1726773086.96051: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface/__pycache__ 11349 1726773086.96056: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios/__pycache__ 11349 1726773086.96077: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr/__pycache__ 11349 1726773086.96097: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware/__pycache__ 11349 1726773086.96104: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential/__pycache__ 11349 1726773086.96110: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/junos/__pycache__ 11349 1726773086.96130: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2/__pycache__ 11349 1726773086.96135: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3/__pycache__ 11349 1726773086.96139: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki/__pycache__ 11349 1726773086.96153: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact/__pycache__ 11349 1726773086.96157: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf/__pycache__ 11349 1726773086.96162: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler/__pycache__ 11349 1726773086.96173: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor/__pycache__ 11349 1726773086.96214: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos/__pycache__ 11349 1726773086.96222: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso/__pycache__ 11349 1726773086.96228: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage/__pycache__ 11349 1726773086.96232: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos/__pycache__ 11349 1726773086.96282: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx/__pycache__ 11349 1726773086.96306: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx/__pycache__ 11349 1726773086.96316: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance/__pycache__ 11349 1726773086.96322: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs/__pycache__ 11349 1726773086.96327: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos/__pycache__ 11349 1726773086.96344: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol/__pycache__ 11349 1726773086.96348: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware/__pycache__ 11349 1726773086.96352: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf/__pycache__ 11349 1726773086.96356: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros/__pycache__ 11349 1726773086.96360: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing/__pycache__ 11349 1726773086.96364: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive/__pycache__ 11349 1726773086.96369: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos/__pycache__ 11349 1726773086.96377: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros/__pycache__ 11349 1726773086.96381: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system/__pycache__ 11349 1726773086.96390: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss/__pycache__ 11349 1726773086.96395: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos/__pycache__ 11349 1726773086.96411: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification/__pycache__ 11349 1726773086.96437: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/language 11349 1726773086.96453: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/os 11349 1726773086.96507: _low_level_execute_command(): starting 11349 1726773086.96515: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11349 1726773086.99052: stdout chunk (state=2): >>>/root <<< 11349 1726773086.99172: stderr chunk (state=3): >>><<< 11349 1726773086.99177: stdout chunk (state=3): >>><<< 11349 1726773086.99199: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11349 1726773086.99215: _low_level_execute_command(): starting 11349 1726773086.99220: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406 `" && echo ansible-tmp-1726773086.992071-11349-14056441176406="` echo /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406 `" ) && sleep 0' 11349 1726773087.02137: stdout chunk (state=2): >>>ansible-tmp-1726773086.992071-11349-14056441176406=/root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406 <<< 11349 1726773087.02252: stderr chunk (state=3): >>><<< 11349 1726773087.02258: stdout chunk (state=3): >>><<< 11349 1726773087.02278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773086.992071-11349-14056441176406=/root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406 , stderr= 11349 1726773087.02394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/dnf-ZIP_DEFLATED 11349 1726773087.02456: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406/AnsiballZ_dnf.py 11349 1726773087.02756: Sending initial data 11349 1726773087.02773: Sent initial data (149 bytes) 11349 1726773087.05239: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp4ua342l9 /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406/AnsiballZ_dnf.py <<< 11349 1726773087.06447: stderr chunk (state=3): >>><<< 11349 1726773087.06456: stdout chunk (state=3): >>><<< 11349 1726773087.06479: done transferring module to remote 11349 1726773087.06496: _low_level_execute_command(): starting 11349 1726773087.06501: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406/ /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406/AnsiballZ_dnf.py && sleep 0' 11349 1726773087.09041: stderr chunk (state=2): >>><<< 11349 1726773087.09052: stdout chunk (state=2): >>><<< 11349 1726773087.09071: _low_level_execute_command() done: rc=0, stdout=, stderr= 11349 1726773087.09074: _low_level_execute_command(): starting 11349 1726773087.09080: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406/AnsiballZ_dnf.py && sleep 0' 11349 1726773089.57894: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 11349 1726773089.61016: stderr 
chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11349 1726773089.61065: stderr chunk (state=3): >>><<< 11349 1726773089.61071: stdout chunk (state=3): >>><<< 11349 1726773089.61093: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11349 1726773089.61131: done with _execute_module (dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11349 1726773089.61140: _low_level_execute_command(): starting 11349 1726773089.61145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773086.992071-11349-14056441176406/ > /dev/null 2>&1 && sleep 0' 11349 1726773089.63791: stderr chunk (state=2): >>><<< 11349 1726773089.63806: stdout chunk (state=2): >>><<< 11349 1726773089.63828: _low_level_execute_command() done: rc=0, stdout=, stderr= 11349 1726773089.63838: handler run complete 11349 1726773089.63879: attempt loop complete, returning result 11349 1726773089.63897: _execute() done 11349 1726773089.63900: dumping result to json 11349 1726773089.63904: done dumping result, returning 11349 1726773089.63922: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [12a3200b-1e9d-1dbd-cc52-0000000009a2] 11349 1726773089.63937: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a2 11349 1726773089.63974: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a2 11349 1726773089.63978: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8119 1726773089.64203: no more pending results, returning what we have 8119 1726773089.64212: results queue empty 8119 1726773089.64214: checking for any_errors_fatal 8119 1726773089.64219: done checking for any_errors_fatal 8119 1726773089.64222: checking for max_fail_percentage 8119 1726773089.64225: done checking for max_fail_percentage 8119 1726773089.64226: checking to see if all hosts have failed and the running result is not ok 8119 1726773089.64228: done checking to see if all hosts have failed 8119 1726773089.64230: getting the remaining hosts for this loop 8119 1726773089.64233: done getting the remaining hosts for this loop 8119 
1726773089.64240: building list of next tasks for hosts 8119 1726773089.64242: getting the next task for host managed_node2 8119 1726773089.64251: done getting next task for host managed_node2 8119 1726773089.64255: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773089.64259: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773089.64261: done building task lists 8119 1726773089.64263: counting tasks in each state of execution 8119 1726773089.64267: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773089.64270: advancing hosts in ITERATING_TASKS 8119 1726773089.64272: starting to advance hosts 8119 1726773089.64273: getting the next task for host managed_node2 8119 1726773089.64277: done getting next task for host managed_node2 8119 1726773089.64279: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773089.64281: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773089.64282: done advancing hosts to next task 8119 1726773089.64296: Loading ActionModule 'debug' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773089.64299: getting variables 8119 1726773089.64301: in VariableManager get_vars() 8119 1726773089.64330: Calling all_inventory to load vars for managed_node2 8119 1726773089.64334: Calling groups_inventory to load vars for managed_node2 8119 1726773089.64336: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773089.64359: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.64369: Calling all_plugins_play to load vars for managed_node2 8119 1726773089.64386: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.64399: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773089.64413: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.64420: Calling groups_plugins_play to load vars for managed_node2 8119 1726773089.64430: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.64448: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.64461: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.64665: done with get_vars() 8119 1726773089.64676: done getting variables 8119 1726773089.64680: sending task start callback, copying the task so we can template it temporarily 8119 1726773089.64682: done copying, going to template now 8119 1726773089.64687: done templating 8119 1726773089.64688: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:29 -0400 (0:00:02.749) 0:01:24.203 **** 8119 1726773089.64704: sending task start callback 8119 1726773089.64706: entering _queue_task() for managed_node2/debug 8119 1726773089.64845: worker is 1 (out of 1 available) 8119 1726773089.64886: exiting _queue_task() for managed_node2/debug 8119 1726773089.64959: done queuing things up, now waiting for results queue to drain 8119 1726773089.64965: waiting for pending results... 
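[editor's note] The "Ensure required packages are installed" task that completed just above (tasks/main.yml:12) is what drove the dnf invocation whose module_args appear earlier in the log (name: ["tuned", "python3-configobj"], state: present). Based on those arguments, the task is roughly equivalent to the sketch below; passing the list through the __kernel_settings_packages variable is an assumption, since the log only shows the already-resolved values:

- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"
    state: present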
11415 1726773089.65026: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 11415 1726773089.65079: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a4 11415 1726773089.65127: calling self._execute() 11415 1726773089.66950: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11415 1726773089.67048: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11415 1726773089.67111: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11415 1726773089.67140: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11415 1726773089.67165: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11415 1726773089.67196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11415 1726773089.67246: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11415 1726773089.67269: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11415 1726773089.67288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11415 1726773089.67378: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11415 1726773089.67398: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11415 1726773089.67418: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11415 1726773089.67677: when evaluation is False, skipping this task 11415 1726773089.67682: _execute() done 11415 1726773089.67687: dumping result to json 11415 1726773089.67688: done dumping result, returning 11415 1726773089.67692: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000009a4] 11415 1726773089.67701: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a4 11415 1726773089.67728: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a4 11415 1726773089.67732: WORKER PROCESS EXITING skipping: [managed_node2] => {} 8119 1726773089.67952: no more pending results, returning what we have 8119 1726773089.67957: results queue empty 8119 1726773089.67959: checking for any_errors_fatal 8119 1726773089.67965: done checking for any_errors_fatal 8119 1726773089.67967: checking for max_fail_percentage 8119 1726773089.67970: done checking for max_fail_percentage 8119 1726773089.67972: checking to see if all hosts have failed and the running result is not ok 8119 1726773089.67974: done checking to see if all hosts have failed 8119 1726773089.67976: getting the remaining hosts for this loop 8119 1726773089.67979: done getting the remaining hosts for this loop 8119 1726773089.67988: building list of next tasks for hosts 8119 1726773089.67991: getting the next task for host managed_node2 8119 1726773089.67998: done getting next task for host managed_node2 8119 1726773089.68003: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773089.68007: ^ state 
is: HOST STATE: block=2, task=41, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773089.68010: done building task lists 8119 1726773089.68012: counting tasks in each state of execution 8119 1726773089.68016: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773089.68018: advancing hosts in ITERATING_TASKS 8119 1726773089.68020: starting to advance hosts 8119 1726773089.68021: getting the next task for host managed_node2 8119 1726773089.68024: done getting next task for host managed_node2 8119 1726773089.68026: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773089.68028: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773089.68029: done advancing hosts to next task 8119 1726773089.68041: Loading ActionModule 'reboot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773089.68044: getting variables 8119 1726773089.68046: in VariableManager get_vars() 8119 1726773089.68073: Calling all_inventory to load vars for managed_node2 8119 1726773089.68076: Calling groups_inventory to load vars for managed_node2 8119 1726773089.68078: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773089.68103: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.68116: Calling all_plugins_play to load vars for managed_node2 8119 1726773089.68128: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.68136: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773089.68147: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.68153: Calling groups_plugins_play to load vars for managed_node2 8119 1726773089.68162: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.68180: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.68196: Loading VarsModule 
'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.68410: done with get_vars() 8119 1726773089.68421: done getting variables 8119 1726773089.68426: sending task start callback, copying the task so we can template it temporarily 8119 1726773089.68428: done copying, going to template now 8119 1726773089.68431: done templating 8119 1726773089.68433: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:29 -0400 (0:00:00.037) 0:01:24.240 **** 8119 1726773089.68450: sending task start callback 8119 1726773089.68451: entering _queue_task() for managed_node2/reboot 8119 1726773089.68577: worker is 1 (out of 1 available) 8119 1726773089.68615: exiting _queue_task() for managed_node2/reboot 8119 1726773089.68690: done queuing things up, now waiting for results queue to drain 8119 1726773089.68695: waiting for pending results... 11417 1726773089.68756: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 11417 1726773089.68813: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a5 11417 1726773089.68858: calling self._execute() 11417 1726773089.70959: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11417 1726773089.71072: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11417 1726773089.71155: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11417 1726773089.71202: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11417 1726773089.71250: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11417 1726773089.71280: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11417 1726773089.71354: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11417 1726773089.71377: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11417 1726773089.71397: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11417 1726773089.71482: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11417 1726773089.71504: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11417 1726773089.71523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11417 1726773089.71780: when evaluation is False, skipping this task 11417 1726773089.71787: _execute() done 11417 1726773089.71789: dumping result to json 11417 1726773089.71790: done dumping result, returning 11417 1726773089.71795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [12a3200b-1e9d-1dbd-cc52-0000000009a5] 11417 1726773089.71803: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a5 11417 1726773089.71830: done sending task 
result for task 12a3200b-1e9d-1dbd-cc52-0000000009a5 11417 1726773089.71833: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773089.71973: no more pending results, returning what we have 8119 1726773089.71980: results queue empty 8119 1726773089.71982: checking for any_errors_fatal 8119 1726773089.71988: done checking for any_errors_fatal 8119 1726773089.71990: checking for max_fail_percentage 8119 1726773089.71993: done checking for max_fail_percentage 8119 1726773089.71995: checking to see if all hosts have failed and the running result is not ok 8119 1726773089.71997: done checking to see if all hosts have failed 8119 1726773089.71999: getting the remaining hosts for this loop 8119 1726773089.72002: done getting the remaining hosts for this loop 8119 1726773089.72010: building list of next tasks for hosts 8119 1726773089.72012: getting the next task for host managed_node2 8119 1726773089.72020: done getting next task for host managed_node2 8119 1726773089.72025: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773089.72029: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773089.72032: done building task lists 8119 1726773089.72034: counting tasks in each state of execution 8119 1726773089.72038: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773089.72040: advancing hosts in ITERATING_TASKS 8119 1726773089.72042: starting to advance hosts 8119 1726773089.72044: getting the next task for host managed_node2 8119 1726773089.72048: done getting next task for host managed_node2 8119 1726773089.72050: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773089.72053: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773089.72055: done advancing hosts to next task 8119 1726773089.72069: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773089.72074: getting variables 8119 1726773089.72077: in VariableManager get_vars() 8119 1726773089.72114: Calling all_inventory to load vars for managed_node2 8119 1726773089.72120: Calling groups_inventory to load vars for managed_node2 8119 1726773089.72123: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773089.72145: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.72155: Calling all_plugins_play to load vars for managed_node2 8119 1726773089.72165: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.72173: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773089.72186: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.72196: Calling groups_plugins_play to load vars for managed_node2 8119 1726773089.72208: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.72227: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.72241: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.72455: done with get_vars() 8119 1726773089.72466: done getting variables 8119 1726773089.72471: sending task start callback, copying the task so we can template it temporarily 8119 1726773089.72472: done copying, going to template now 8119 1726773089.72474: done templating 8119 1726773089.72475: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:29 -0400 (0:00:00.040) 0:01:24.281 **** 8119 1726773089.72494: sending task start callback 8119 1726773089.72496: entering _queue_task() for managed_node2/fail 8119 1726773089.72615: worker is 1 (out of 1 available) 8119 1726773089.72654: exiting _queue_task() for managed_node2/fail 8119 1726773089.72729: done queuing things up, now waiting for results queue to drain 8119 1726773089.72734: waiting for pending results... 
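[editor's note] The "skipping" results in this stretch (the debug task at main.yml:24, the reboot task at main.yml:29, and the fail task at main.yml:34 queued here) are all gated by when: conditions that evaluate to False on this run, which is why the executor logs "when evaluation is False, skipping this task" before running any handler. A hedged sketch of the general shape of such a gated task; the variable names in the condition are assumptions, not taken from the role source:

- name: Fail if reboot is needed and not set
  fail:
    msg: Reboot is required to apply changes but the reboot-ok flag is not set   # assumed wording
  when:
    - __kernel_settings_reboot_required | d(false)   # assumed variable name
    - not kernel_settings_reboot_ok | d(false)       # assumed variable name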
11420 1726773089.72799: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 11420 1726773089.72853: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a6 11420 1726773089.72898: calling self._execute() 11420 1726773089.75501: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11420 1726773089.75617: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11420 1726773089.75700: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11420 1726773089.75746: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11420 1726773089.75791: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11420 1726773089.75850: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11420 1726773089.75923: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11420 1726773089.75956: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11420 1726773089.75981: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11420 1726773089.76212: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11420 1726773089.76238: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11420 1726773089.76259: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11420 1726773089.76626: when evaluation is False, skipping this task 11420 1726773089.76632: _execute() done 11420 1726773089.76635: dumping result to json 11420 1726773089.76638: done dumping result, returning 11420 1726773089.76644: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [12a3200b-1e9d-1dbd-cc52-0000000009a6] 11420 1726773089.76657: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a6 11420 1726773089.76688: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a6 11420 1726773089.76692: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773089.76878: no more pending results, returning what we have 8119 1726773089.76884: results queue empty 8119 1726773089.76887: checking for any_errors_fatal 8119 1726773089.76892: done checking for any_errors_fatal 8119 1726773089.76894: checking for max_fail_percentage 8119 1726773089.76897: done checking for max_fail_percentage 8119 1726773089.76900: checking to see if all hosts have failed and the running result is not ok 8119 1726773089.76903: done checking to see if all hosts have failed 8119 1726773089.76904: getting the remaining hosts for this loop 8119 1726773089.76907: done getting the remaining hosts for this loop 8119 1726773089.76915: building list of next tasks for hosts 8119 1726773089.76918: getting the next task for host managed_node2 8119 1726773089.76927: done getting next task for host managed_node2 8119 1726773089.76933: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 
1726773089.76937: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773089.76940: done building task lists 8119 1726773089.76942: counting tasks in each state of execution 8119 1726773089.76946: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773089.76948: advancing hosts in ITERATING_TASKS 8119 1726773089.76950: starting to advance hosts 8119 1726773089.76952: getting the next task for host managed_node2 8119 1726773089.76957: done getting next task for host managed_node2 8119 1726773089.76960: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 1726773089.76962: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773089.76964: done advancing hosts to next task 8119 1726773089.77008: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773089.77015: getting variables 8119 1726773089.77018: in VariableManager get_vars() 8119 1726773089.77051: Calling all_inventory to load vars for managed_node2 8119 1726773089.77056: Calling groups_inventory to load vars for managed_node2 8119 1726773089.77059: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773089.77086: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.77100: Calling all_plugins_play to load vars for managed_node2 8119 1726773089.77116: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.77128: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773089.77143: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.77151: Calling groups_plugins_play to load vars for managed_node2 8119 1726773089.77164: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.77191: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773089.77210: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, 
class_only=False) 8119 1726773089.77524: done with get_vars() 8119 1726773089.77537: done getting variables 8119 1726773089.77542: sending task start callback, copying the task so we can template it temporarily 8119 1726773089.77544: done copying, going to template now 8119 1726773089.77547: done templating 8119 1726773089.77549: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:29 -0400 (0:00:00.050) 0:01:24.332 **** 8119 1726773089.77570: sending task start callback 8119 1726773089.77573: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773089.77709: worker is 1 (out of 1 available) 8119 1726773089.77744: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773089.77813: done queuing things up, now waiting for results queue to drain 8119 1726773089.77819: waiting for pending results... 11427 1726773089.78058: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 11427 1726773089.78126: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a8 11427 1726773089.78181: calling self._execute() 11427 1726773089.80661: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11427 1726773089.80775: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11427 1726773089.80848: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11427 1726773089.80893: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11427 1726773089.80934: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11427 1726773089.80976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11427 1726773089.81030: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11427 1726773089.81053: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11427 1726773089.81068: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11427 1726773089.81169: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11427 1726773089.81190: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11427 1726773089.81214: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11427 1726773089.81454: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11427 1726773089.81495: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11427 1726773089.81506: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11427 1726773089.81518: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11427 1726773089.81524: Loading ShellModule 'sh' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11427 1726773089.81606: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11427 1726773089.81620: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11427 1726773089.81644: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11427 1726773089.81659: starting attempt loop 11427 1726773089.81661: running the handler 11427 1726773089.81669: _low_level_execute_command(): starting 11427 1726773089.81673: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11427 1726773089.84206: stdout chunk (state=2): >>>/root <<< 11427 1726773089.84323: stderr chunk (state=3): >>><<< 11427 1726773089.84328: stdout chunk (state=3): >>><<< 11427 1726773089.84347: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11427 1726773089.84363: _low_level_execute_command(): starting 11427 1726773089.84370: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447 `" && echo ansible-tmp-1726773089.8435714-11427-67471146755447="` echo /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447 `" ) && sleep 0' 11427 1726773089.87141: stdout chunk (state=2): >>>ansible-tmp-1726773089.8435714-11427-67471146755447=/root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447 <<< 11427 1726773089.87346: stderr chunk (state=3): >>><<< 11427 1726773089.87352: stdout chunk (state=3): >>><<< 11427 1726773089.87370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773089.8435714-11427-67471146755447=/root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447 , stderr= 11427 1726773089.87453: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 11427 1726773089.87509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447/AnsiballZ_kernel_settings_get_config.py 11427 1726773089.87828: Sending initial data 11427 1726773089.87842: Sent initial data (173 bytes) 11427 1726773089.90389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp8tkwdq_c /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447/AnsiballZ_kernel_settings_get_config.py <<< 11427 1726773089.91445: stderr chunk (state=3): >>><<< 11427 1726773089.91450: stdout chunk (state=3): >>><<< 11427 1726773089.91471: done transferring module to remote 11427 1726773089.91487: _low_level_execute_command(): starting 11427 1726773089.91492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447/ /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11427 1726773089.93991: stderr chunk (state=2): >>><<< 11427 1726773089.94002: stdout chunk (state=2): >>><<< 11427 1726773089.94022: _low_level_execute_command() 
done: rc=0, stdout=, stderr= 11427 1726773089.94027: _low_level_execute_command(): starting 11427 1726773089.94033: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11427 1726773090.09171: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 11427 1726773090.10172: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11427 1726773090.10222: stderr chunk (state=3): >>><<< 11427 1726773090.10228: stdout chunk (state=3): >>><<< 11427 1726773090.10250: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 11427 1726773090.10277: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11427 1726773090.10291: _low_level_execute_command(): starting 11427 1726773090.10298: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773089.8435714-11427-67471146755447/ > /dev/null 2>&1 && sleep 0' 11427 1726773090.13497: stderr chunk (state=2): >>><<< 11427 1726773090.13516: stdout chunk (state=2): >>><<< 11427 1726773090.13547: _low_level_execute_command() done: rc=0, stdout=, stderr= 11427 1726773090.13557: handler run complete 11427 1726773090.13596: attempt loop complete, returning result 11427 1726773090.13619: _execute() done 11427 1726773090.13623: dumping result to json 11427 1726773090.13628: done dumping result, returning 11427 1726773090.13646: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [12a3200b-1e9d-1dbd-cc52-0000000009a8] 11427 1726773090.13663: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a8 11427 1726773090.13730: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a8 11427 1726773090.13735: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", 
"udev_buffer_size": "1MB", "update_interval": "10" } } 8119 1726773090.14161: no more pending results, returning what we have 8119 1726773090.14167: results queue empty 8119 1726773090.14170: checking for any_errors_fatal 8119 1726773090.14175: done checking for any_errors_fatal 8119 1726773090.14178: checking for max_fail_percentage 8119 1726773090.14181: done checking for max_fail_percentage 8119 1726773090.14186: checking to see if all hosts have failed and the running result is not ok 8119 1726773090.14188: done checking to see if all hosts have failed 8119 1726773090.14190: getting the remaining hosts for this loop 8119 1726773090.14193: done getting the remaining hosts for this loop 8119 1726773090.14201: building list of next tasks for hosts 8119 1726773090.14204: getting the next task for host managed_node2 8119 1726773090.14214: done getting next task for host managed_node2 8119 1726773090.14219: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773090.14224: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773090.14227: done building task lists 8119 1726773090.14229: counting tasks in each state of execution 8119 1726773090.14233: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773090.14236: advancing hosts in ITERATING_TASKS 8119 1726773090.14238: starting to advance hosts 8119 1726773090.14241: getting the next task for host managed_node2 8119 1726773090.14245: done getting next task for host managed_node2 8119 1726773090.14249: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773090.14252: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773090.14254: done advancing hosts to next task 8119 1726773090.14269: getting variables 8119 1726773090.14273: in VariableManager get_vars() 8119 1726773090.14316: Calling all_inventory to load vars for managed_node2 8119 1726773090.14323: Calling groups_inventory to load vars for managed_node2 8119 1726773090.14327: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773090.14357: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.14374: Calling all_plugins_play to load vars for managed_node2 8119 1726773090.14396: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.14414: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773090.14434: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.14446: Calling groups_plugins_play to load vars for managed_node2 8119 1726773090.14463: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.14495: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.14524: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.14893: done with get_vars() 8119 1726773090.14907: done getting variables 8119 1726773090.14917: sending task start callback, copying the task so we can template it temporarily 8119 1726773090.14920: done copying, going to template now 8119 1726773090.14923: done templating 8119 1726773090.14925: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.373) 0:01:24.705 **** 8119 1726773090.14950: sending task start callback 8119 1726773090.14953: entering _queue_task() for managed_node2/stat 8119 1726773090.15137: worker is 1 (out of 1 available) 8119 1726773090.15174: exiting _queue_task() for managed_node2/stat 8119 1726773090.15255: done queuing things up, now waiting for results queue to drain 8119 1726773090.15260: waiting for pending results... 
11445 1726773090.15700: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 11445 1726773090.15766: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009a9 11445 1726773090.18200: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11445 1726773090.18306: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11445 1726773090.18392: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11445 1726773090.18431: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11445 1726773090.18468: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11445 1726773090.18505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11445 1726773090.18562: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11445 1726773090.18591: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11445 1726773090.18615: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11445 1726773090.18717: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11445 1726773090.18739: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11445 1726773090.18757: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11445 1726773090.19248: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11445 1726773090.19254: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11445 1726773090.19257: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11445 1726773090.19261: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11445 1726773090.19264: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11445 1726773090.19267: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.19270: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11445 1726773090.19273: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11445 1726773090.19276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11445 1726773090.19301: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11445 
1726773090.19306: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11445 1726773090.19312: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.19695: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11445 1726773090.19702: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11445 1726773090.19706: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11445 1726773090.19712: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11445 1726773090.19715: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11445 1726773090.19719: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.19722: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11445 1726773090.19724: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11445 1726773090.19727: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11445 1726773090.19755: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11445 1726773090.19760: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11445 1726773090.19763: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.20016: when evaluation is False, skipping this task 11445 1726773090.20064: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11445 1726773090.20069: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11445 1726773090.20073: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11445 1726773090.20076: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11445 1726773090.20079: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11445 1726773090.20084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.20088: Loading 
FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11445 1726773090.20091: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11445 1726773090.20094: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11445 1726773090.20125: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11445 1726773090.20130: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11445 1726773090.20133: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.20351: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11445 1726773090.20357: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11445 1726773090.20360: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11445 1726773090.20364: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11445 1726773090.20367: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11445 1726773090.20371: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.20374: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11445 1726773090.20377: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11445 1726773090.20380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11445 1726773090.20411: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11445 1726773090.20417: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11445 1726773090.20420: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.20784: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11445 1726773090.20837: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11445 1726773090.20852: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11445 1726773090.20869: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11445 1726773090.20877: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11445 1726773090.20995: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11445 1726773090.21020: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11445 1726773090.21056: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11445 1726773090.21076: starting attempt loop 11445 1726773090.21080: running the handler 11445 1726773090.21091: _low_level_execute_command(): starting 11445 1726773090.21098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "item": "", "skip_reason": "Conditional result was False" } 11445 1726773090.24014: stdout chunk (state=2): >>>/root <<< 11445 1726773090.24203: stderr chunk (state=3): >>><<< 11445 1726773090.24213: stdout chunk (state=3): >>><<< 11445 1726773090.24238: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11445 1726773090.24255: _low_level_execute_command(): starting 11445 1726773090.24263: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443 `" && echo ansible-tmp-1726773090.2424824-11445-109226349860443="` echo /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443 `" ) && sleep 0' 11445 1726773090.27606: stdout chunk (state=2): >>>ansible-tmp-1726773090.2424824-11445-109226349860443=/root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443 <<< 11445 1726773090.27690: stderr chunk (state=3): >>><<< 11445 1726773090.27698: stdout chunk (state=3): >>><<< 11445 1726773090.27724: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773090.2424824-11445-109226349860443=/root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443 , stderr= 11445 1726773090.27840: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11445 1726773090.27914: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443/AnsiballZ_stat.py 11445 1726773090.28649: Sending initial data 11445 1726773090.28666: Sent initial data (152 bytes) 11445 1726773090.31289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpssthzcvq /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443/AnsiballZ_stat.py <<< 11445 1726773090.32513: stderr chunk (state=3): >>><<< 11445 1726773090.32520: stdout chunk (state=3): >>><<< 11445 1726773090.32550: done transferring module to remote 11445 1726773090.32568: _low_level_execute_command(): starting 11445 1726773090.32573: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443/ /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443/AnsiballZ_stat.py && sleep 0' 11445 1726773090.35167: stderr chunk (state=2): >>><<< 11445 1726773090.35181: stdout chunk (state=2): >>><<< 11445 
1726773090.35209: _low_level_execute_command() done: rc=0, stdout=, stderr= 11445 1726773090.35215: _low_level_execute_command(): starting 11445 1726773090.35223: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443/AnsiballZ_stat.py && sleep 0' 11445 1726773090.49870: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11445 1726773090.50790: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11445 1726773090.50841: stderr chunk (state=3): >>><<< 11445 1726773090.50846: stdout chunk (state=3): >>><<< 11445 1726773090.50866: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 11445 1726773090.50892: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11445 1726773090.50904: _low_level_execute_command(): starting 11445 1726773090.50911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773090.2424824-11445-109226349860443/ > /dev/null 2>&1 && sleep 0' 11445 1726773090.53753: stderr chunk (state=2): >>><<< 11445 1726773090.53764: stdout chunk (state=2): >>><<< 11445 1726773090.53784: _low_level_execute_command() done: rc=0, stdout=, stderr= 11445 1726773090.53793: handler run complete 11445 1726773090.53823: attempt loop complete, returning result 11445 1726773090.54147: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11445 1726773090.54153: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11445 1726773090.54156: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11445 1726773090.54158: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11445 1726773090.54160: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11445 1726773090.54162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11445 1726773090.54164: Loading FilterModule 'network' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11445 1726773090.54166: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11445 1726773090.54168: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11445 1726773090.54193: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11445 1726773090.54197: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11445 1726773090.54199: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 11445 1726773090.54520: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11445 1726773090.54527: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11445 1726773090.54531: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11445 1726773090.54627: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11445 1726773090.54648: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11445 1726773090.54654: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11445 1726773090.54660: starting attempt loop 11445 1726773090.54662: running the handler 11445 1726773090.54667: _low_level_execute_command(): starting 11445 1726773090.54670: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11445 1726773090.57314: stdout chunk (state=2): >>>/root <<< 11445 1726773090.57431: stderr chunk (state=3): >>><<< 11445 1726773090.57436: stdout chunk (state=3): >>><<< 11445 1726773090.57452: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11445 1726773090.57464: _low_level_execute_command(): starting 11445 1726773090.57470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396 `" && echo ansible-tmp-1726773090.5745962-11445-128425323952396="` echo /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396 `" ) && sleep 0' 11445 1726773090.60236: stdout chunk (state=2): >>>ansible-tmp-1726773090.5745962-11445-128425323952396=/root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396 <<< 11445 1726773090.60361: stderr chunk (state=3): >>><<< 11445 1726773090.60366: stdout chunk (state=3): >>><<< 11445 1726773090.60384: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773090.5745962-11445-128425323952396=/root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396 , stderr= 11445 1726773090.60468: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11445 1726773090.60520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396/AnsiballZ_stat.py 11445 1726773090.60803: Sending initial data 11445 1726773090.60820: Sent initial data (152 bytes) 11445 1726773090.63226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpqt5q24y7 /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396/AnsiballZ_stat.py <<< 11445 1726773090.64222: stderr chunk (state=3): >>><<< 11445 1726773090.64228: stdout chunk (state=3): >>><<< 11445 1726773090.64250: done transferring module to remote 11445 1726773090.64265: _low_level_execute_command(): starting 11445 1726773090.64270: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396/ /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396/AnsiballZ_stat.py && sleep 0' 11445 1726773090.66805: stderr chunk (state=2): >>><<< 11445 1726773090.66817: stdout chunk (state=2): >>><<< 11445 1726773090.66835: _low_level_execute_command() done: rc=0, stdout=, stderr= 11445 1726773090.66839: _low_level_execute_command(): starting 11445 1726773090.66847: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396/AnsiballZ_stat.py && sleep 0' 11445 1726773090.81932: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11445 1726773090.82954: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11445 1726773090.83004: stderr chunk (state=3): >>><<< 11445 1726773090.83010: stdout chunk (state=3): >>><<< 11445 1726773090.83030: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 11445 1726773090.83087: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11445 1726773090.83099: _low_level_execute_command(): starting 11445 1726773090.83104: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773090.5745962-11445-128425323952396/ > /dev/null 2>&1 && sleep 0' 11445 1726773090.85739: stderr chunk (state=2): >>><<< 11445 1726773090.85751: stdout chunk (state=2): >>><<< 11445 1726773090.85775: _low_level_execute_command() done: rc=0, stdout=, stderr= 11445 1726773090.85785: handler run complete 11445 1726773090.85831: attempt loop complete, returning result 11445 1726773090.86004: dumping result to json 11445 1726773090.86065: done dumping result, returning 11445 1726773090.86078: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [12a3200b-1e9d-1dbd-cc52-0000000009a9] 11445 1726773090.86089: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a9 11445 1726773090.86093: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009a9 11445 1726773090.86095: WORKER PROCESS EXITING ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773035.2883239, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773033.0853279, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 
1726773033.0853279, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8119 1726773090.86485: no more pending results, returning what we have 8119 1726773090.86492: results queue empty 8119 1726773090.86494: checking for any_errors_fatal 8119 1726773090.86498: done checking for any_errors_fatal 8119 1726773090.86499: checking for max_fail_percentage 8119 1726773090.86501: done checking for max_fail_percentage 8119 1726773090.86503: checking to see if all hosts have failed and the running result is not ok 8119 1726773090.86504: done checking to see if all hosts have failed 8119 1726773090.86505: getting the remaining hosts for this loop 8119 1726773090.86507: done getting the remaining hosts for this loop 8119 1726773090.86512: building list of next tasks for hosts 8119 1726773090.86515: getting the next task for host managed_node2 8119 1726773090.86520: done getting next task for host managed_node2 8119 1726773090.86522: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773090.86525: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773090.86527: done building task lists 8119 1726773090.86528: counting tasks in each state of execution 8119 1726773090.86531: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773090.86532: advancing hosts in ITERATING_TASKS 8119 1726773090.86534: starting to advance hosts 8119 1726773090.86535: getting the next task for host managed_node2 8119 1726773090.86537: done getting next task for host managed_node2 8119 1726773090.86539: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773090.86542: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773090.86544: done advancing hosts to next task 8119 1726773090.86557: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773090.86560: getting variables 8119 1726773090.86563: in VariableManager get_vars() 8119 1726773090.86592: Calling all_inventory to load vars for managed_node2 8119 1726773090.86596: Calling groups_inventory to load vars for managed_node2 8119 1726773090.86598: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773090.86621: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.86632: Calling all_plugins_play to load vars for managed_node2 8119 1726773090.86642: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.86651: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773090.86665: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.86673: Calling groups_plugins_play to load vars for managed_node2 8119 1726773090.86685: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.86705: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.86727: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.86993: done with get_vars() 8119 1726773090.87006: done getting variables 8119 1726773090.87017: sending task start callback, copying the task so we can template it temporarily 8119 1726773090.87019: done copying, going to template now 8119 1726773090.87022: done templating 8119 1726773090.87024: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.720) 0:01:25.426 **** 8119 1726773090.87046: sending task start callback 8119 1726773090.87049: entering _queue_task() for managed_node2/set_fact 8119 1726773090.87212: worker is 1 (out of 1 available) 8119 1726773090.87249: exiting _queue_task() for managed_node2/set_fact 8119 1726773090.87332: done queuing things up, now waiting for results queue to drain 8119 1726773090.87338: waiting for pending results... 
11481 1726773090.87561: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 11481 1726773090.87636: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009aa 11481 1726773090.87691: calling self._execute() 11481 1726773090.89835: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11481 1726773090.89924: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11481 1726773090.89988: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11481 1726773090.90023: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11481 1726773090.90050: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11481 1726773090.90078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11481 1726773090.90133: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11481 1726773090.90156: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11481 1726773090.90172: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11481 1726773090.90259: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11481 1726773090.90275: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11481 1726773090.90291: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11481 1726773090.90746: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11481 1726773090.90779: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11481 1726773090.90792: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11481 1726773090.90806: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11481 1726773090.90814: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11481 1726773090.90912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11481 1726773090.90931: starting attempt loop 11481 1726773090.90933: running the handler 11481 1726773090.90945: handler run complete 11481 1726773090.90948: attempt loop complete, returning result 11481 1726773090.90950: _execute() done 11481 1726773090.90951: dumping result to json 11481 1726773090.90953: done dumping result, returning 11481 1726773090.90959: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [12a3200b-1e9d-1dbd-cc52-0000000009aa] 11481 1726773090.90969: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009aa 11481 1726773090.90998: done sending task result for task 
12a3200b-1e9d-1dbd-cc52-0000000009aa 11481 1726773090.91002: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8119 1726773090.91302: no more pending results, returning what we have 8119 1726773090.91308: results queue empty 8119 1726773090.91310: checking for any_errors_fatal 8119 1726773090.91318: done checking for any_errors_fatal 8119 1726773090.91319: checking for max_fail_percentage 8119 1726773090.91323: done checking for max_fail_percentage 8119 1726773090.91325: checking to see if all hosts have failed and the running result is not ok 8119 1726773090.91327: done checking to see if all hosts have failed 8119 1726773090.91329: getting the remaining hosts for this loop 8119 1726773090.91331: done getting the remaining hosts for this loop 8119 1726773090.91338: building list of next tasks for hosts 8119 1726773090.91341: getting the next task for host managed_node2 8119 1726773090.91348: done getting next task for host managed_node2 8119 1726773090.91351: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773090.91355: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773090.91358: done building task lists 8119 1726773090.91359: counting tasks in each state of execution 8119 1726773090.91363: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773090.91365: advancing hosts in ITERATING_TASKS 8119 1726773090.91367: starting to advance hosts 8119 1726773090.91369: getting the next task for host managed_node2 8119 1726773090.91373: done getting next task for host managed_node2 8119 1726773090.91376: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773090.91379: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
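The set_fact step above resolved __kernel_settings_profile_parent to /etc/tuned, i.e. the first directory from the stat loop that actually exists. A sketch of how such a fact could be derived from the registered loop results; the fact name and its final value are from the log, but the register name and the selection expression are assumptions:

- name: Set tuned profile parent dir
  set_fact:
    __kernel_settings_profile_parent: >-
      {{ __kernel_settings_find_profile_dirs.results
         | selectattr('stat', 'defined')
         | selectattr('stat.exists')
         | map(attribute='item')
         | first }}

Skipped loop items carry no stat key, so filtering on "stat is defined" first drops them before testing stat.exists; with the results shown above this yields /etc/tuned.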
False 8119 1726773090.91381: done advancing hosts to next task 8119 1726773090.91397: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773090.91402: getting variables 8119 1726773090.91405: in VariableManager get_vars() 8119 1726773090.91439: Calling all_inventory to load vars for managed_node2 8119 1726773090.91445: Calling groups_inventory to load vars for managed_node2 8119 1726773090.91449: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773090.91477: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.91495: Calling all_plugins_play to load vars for managed_node2 8119 1726773090.91511: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.91521: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773090.91533: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.91539: Calling groups_plugins_play to load vars for managed_node2 8119 1726773090.91548: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.91566: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.91580: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773090.91923: done with get_vars() 8119 1726773090.91942: done getting variables 8119 1726773090.91950: sending task start callback, copying the task so we can template it temporarily 8119 1726773090.91953: done copying, going to template now 8119 1726773090.91956: done templating 8119 1726773090.91958: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.049) 0:01:25.476 **** 8119 1726773090.91984: sending task start callback 8119 1726773090.91987: entering _queue_task() for managed_node2/service 8119 1726773090.92144: worker is 1 (out of 1 available) 8119 1726773090.92189: exiting _queue_task() for managed_node2/service 8119 1726773090.92265: done queuing things up, now waiting for results queue to drain 8119 1726773090.92269: waiting for pending results... 
11485 1726773090.92394: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 11485 1726773090.92450: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009ab 11485 1726773090.94545: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11485 1726773090.94635: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11485 1726773090.94700: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11485 1726773090.94731: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11485 1726773090.94761: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11485 1726773090.94794: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11485 1726773090.94843: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11485 1726773090.94868: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11485 1726773090.94889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11485 1726773090.94969: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11485 1726773090.94992: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11485 1726773090.95010: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11485 1726773090.95174: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11485 1726773090.95178: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11485 1726773090.95180: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11485 1726773090.95184: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11485 1726773090.95186: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11485 1726773090.95188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11485 1726773090.95190: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11485 1726773090.95192: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11485 1726773090.95193: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11485 1726773090.95214: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 
11485 1726773090.95217: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11485 1726773090.95219: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11485 1726773090.95384: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11485 1726773090.95390: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11485 1726773090.95394: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11485 1726773090.95396: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11485 1726773090.95398: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11485 1726773090.95399: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11485 1726773090.95401: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11485 1726773090.95403: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11485 1726773090.95404: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11485 1726773090.95426: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11485 1726773090.95429: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11485 1726773090.95432: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11485 1726773090.95541: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11485 1726773090.95574: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11485 1726773090.95587: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11485 1726773090.95599: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11485 1726773090.95605: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11485 1726773090.95713: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11485 1726773090.95732: starting attempt loop 11485 1726773090.95736: running the handler 11485 1726773090.95862: _low_level_execute_command(): 
starting 11485 1726773090.95868: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11485 1726773090.98600: stdout chunk (state=2): >>>/root <<< 11485 1726773090.98762: stderr chunk (state=3): >>><<< 11485 1726773090.98768: stdout chunk (state=3): >>><<< 11485 1726773090.98792: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11485 1726773090.98811: _low_level_execute_command(): starting 11485 1726773090.98818: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125 `" && echo ansible-tmp-1726773090.9880152-11485-117456091011125="` echo /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125 `" ) && sleep 0' 11485 1726773091.01693: stdout chunk (state=2): >>>ansible-tmp-1726773090.9880152-11485-117456091011125=/root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125 <<< 11485 1726773091.01899: stderr chunk (state=3): >>><<< 11485 1726773091.01905: stdout chunk (state=3): >>><<< 11485 1726773091.01927: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773090.9880152-11485-117456091011125=/root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125 , stderr= 11485 1726773091.02047: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/systemd-ZIP_DEFLATED 11485 1726773091.02148: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125/AnsiballZ_systemd.py 11485 1726773091.02518: Sending initial data 11485 1726773091.02534: Sent initial data (155 bytes) 11485 1726773091.05033: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmph6ihlpt7 /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125/AnsiballZ_systemd.py <<< 11485 1726773091.06870: stderr chunk (state=3): >>><<< 11485 1726773091.06877: stdout chunk (state=3): >>><<< 11485 1726773091.06904: done transferring module to remote 11485 1726773091.06920: _low_level_execute_command(): starting 11485 1726773091.06924: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125/ /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125/AnsiballZ_systemd.py && sleep 0' 11485 1726773091.09574: stderr chunk (state=2): >>><<< 11485 1726773091.09588: stdout chunk (state=2): >>><<< 11485 1726773091.09607: _low_level_execute_command() done: rc=0, stdout=, stderr= 11485 1726773091.09611: _low_level_execute_command(): starting 11485 1726773091.09618: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125/AnsiballZ_systemd.py && sleep 0' 11485 1726773091.35702: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", 
"Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18890752", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11485 1726773091.35751: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot 
cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChange<<< 11485 1726773091.35760: stdout chunk (state=3): >>>TimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 11485 
1726773091.37186: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11485 1726773091.37194: stdout chunk (state=3): >>><<< 11485 1726773091.37201: stderr chunk (state=3): >>><<< 11485 1726773091.37218: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18890752", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": 
"no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11485 1726773091.37357: done with _execute_module (systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11485 1726773091.37373: _low_level_execute_command(): starting 11485 1726773091.37378: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773090.9880152-11485-117456091011125/ > /dev/null 2>&1 && sleep 0' 11485 1726773091.40095: stderr chunk (state=2): >>><<< 11485 1726773091.40113: stdout chunk (state=2): >>><<< 11485 1726773091.40136: _low_level_execute_command() done: rc=0, stdout=, stderr= 11485 1726773091.40146: handler run complete 11485 1726773091.40153: attempt loop complete, returning result 11485 1726773091.40245: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11485 1726773091.40254: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11485 1726773091.40260: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11485 1726773091.40264: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11485 1726773091.40268: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11485 1726773091.40272: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11485 1726773091.40276: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11485 1726773091.40280: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11485 1726773091.40285: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11485 1726773091.40340: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11485 1726773091.40346: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11485 1726773091.40350: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11485 1726773091.40570: dumping result to json 11485 1726773091.40598: done dumping result, returning 11485 1726773091.40618: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [12a3200b-1e9d-1dbd-cc52-0000000009ab] 11485 1726773091.40631: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ab 11485 1726773091.40635: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ab 11485 1726773091.40638: WORKER PROCESS EXITING ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", 
"Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "658", "MemoryAccounting": "yes", "MemoryCurrent": "18890752", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "WatchdogUSec": "0" } } 8119 1726773091.41640: no more pending results, returning what we have 8119 1726773091.41647: results queue empty 8119 1726773091.41650: checking for any_errors_fatal 8119 1726773091.41654: done checking for any_errors_fatal 8119 1726773091.41656: checking for max_fail_percentage 8119 1726773091.41659: done checking for max_fail_percentage 8119 1726773091.41661: checking to see if all hosts have failed and the running result is not ok 8119 1726773091.41663: done checking to see if all hosts have failed 8119 1726773091.41665: getting the remaining hosts for this loop 8119 1726773091.41667: done getting the remaining hosts for this loop 8119 1726773091.41674: building list of next tasks for hosts 8119 1726773091.41676: getting the next task for host managed_node2 8119 1726773091.41685: done getting next task for host managed_node2 8119 1726773091.41689: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773091.41693: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773091.41695: done building task lists 8119 1726773091.41697: counting tasks in each state of execution 8119 1726773091.41701: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773091.41703: advancing hosts in ITERATING_TASKS 8119 1726773091.41705: starting to advance hosts 8119 1726773091.41707: getting the next task for host managed_node2 8119 1726773091.41713: done getting next task for host managed_node2 8119 1726773091.41716: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773091.41719: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773091.41721: done advancing hosts to next task 8119 1726773091.41734: getting variables 8119 1726773091.41738: in VariableManager get_vars() 8119 1726773091.41770: Calling all_inventory to load vars for managed_node2 8119 1726773091.41775: Calling groups_inventory to load vars for managed_node2 8119 1726773091.41778: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773091.41806: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.41823: Calling all_plugins_play to load vars for managed_node2 8119 1726773091.41837: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.41849: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773091.41864: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.41873: Calling groups_plugins_play to load vars for managed_node2 8119 1726773091.41888: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.41917: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.41937: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.42245: done with get_vars() 8119 1726773091.42259: done getting variables 8119 1726773091.42266: sending task start callback, copying the task so we can template it temporarily 8119 1726773091.42269: done copying, going to template now 8119 1726773091.42272: done templating 8119 1726773091.42274: here goes the callback... 
TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.503) 0:01:25.979 **** 8119 1726773091.42301: sending task start callback 8119 1726773091.42304: entering _queue_task() for managed_node2/file 8119 1726773091.42469: worker is 1 (out of 1 available) 8119 1726773091.42512: exiting _queue_task() for managed_node2/file 8119 1726773091.42587: done queuing things up, now waiting for results queue to drain 8119 1726773091.42593: waiting for pending results... 11503 1726773091.42824: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 11503 1726773091.42893: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009ac 11503 1726773091.42952: calling self._execute() 11503 1726773091.45536: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11503 1726773091.45630: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11503 1726773091.45688: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11503 1726773091.45724: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11503 1726773091.45752: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11503 1726773091.45780: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11503 1726773091.45858: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11503 1726773091.45892: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11503 1726773091.45915: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11503 1726773091.46010: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11503 1726773091.46030: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11503 1726773091.46044: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11503 1726773091.46296: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11503 1726773091.46339: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11503 1726773091.46351: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11503 1726773091.46361: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11503 1726773091.46366: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11503 1726773091.46457: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11503 1726773091.46471: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11503 1726773091.46495: Loading 
ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11503 1726773091.46514: starting attempt loop 11503 1726773091.46516: running the handler 11503 1726773091.46526: _low_level_execute_command(): starting 11503 1726773091.46531: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11503 1726773091.49031: stdout chunk (state=2): >>>/root <<< 11503 1726773091.49146: stderr chunk (state=3): >>><<< 11503 1726773091.49151: stdout chunk (state=3): >>><<< 11503 1726773091.49172: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11503 1726773091.49190: _low_level_execute_command(): starting 11503 1726773091.49200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578 `" && echo ansible-tmp-1726773091.4918096-11503-247089474251578="` echo /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578 `" ) && sleep 0' 11503 1726773091.51908: stdout chunk (state=2): >>>ansible-tmp-1726773091.4918096-11503-247089474251578=/root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578 <<< 11503 1726773091.52040: stderr chunk (state=3): >>><<< 11503 1726773091.52047: stdout chunk (state=3): >>><<< 11503 1726773091.52075: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773091.4918096-11503-247089474251578=/root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578 , stderr= 11503 1726773091.52174: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11503 1726773091.52240: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578/AnsiballZ_file.py 11503 1726773091.52611: Sending initial data 11503 1726773091.52626: Sent initial data (152 bytes) 11503 1726773091.55096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp2_5ixhmh /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578/AnsiballZ_file.py <<< 11503 1726773091.56443: stderr chunk (state=3): >>><<< 11503 1726773091.56451: stdout chunk (state=3): >>><<< 11503 1726773091.56477: done transferring module to remote 11503 1726773091.56496: _low_level_execute_command(): starting 11503 1726773091.56502: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578/ /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578/AnsiballZ_file.py && sleep 0' 11503 1726773091.59091: stderr chunk (state=2): >>><<< 11503 1726773091.59105: stdout chunk (state=2): >>><<< 11503 1726773091.59126: _low_level_execute_command() done: rc=0, stdout=, stderr= 11503 1726773091.59130: _low_level_execute_command(): starting 11503 1726773091.59137: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578/AnsiballZ_file.py && sleep 0' 11503 1726773091.74928: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": 
"directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11503 1726773091.75964: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11503 1726773091.76030: stderr chunk (state=3): >>><<< 11503 1726773091.76036: stdout chunk (state=3): >>><<< 11503 1726773091.76056: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
11503 1726773091.76092: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11503 1726773091.76107: _low_level_execute_command(): starting 11503 1726773091.76118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773091.4918096-11503-247089474251578/ > /dev/null 2>&1 && sleep 0' 11503 1726773091.80362: stderr chunk (state=2): >>><<< 11503 1726773091.80378: stdout chunk (state=2): >>><<< 11503 1726773091.80409: _low_level_execute_command() done: rc=0, stdout=, stderr= 11503 1726773091.80420: handler run complete 11503 1726773091.80427: attempt loop complete, returning result 11503 1726773091.80445: _execute() done 11503 1726773091.80449: dumping result to json 11503 1726773091.80457: done dumping result, returning 11503 1726773091.80475: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [12a3200b-1e9d-1dbd-cc52-0000000009ac] 11503 1726773091.80497: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ac 11503 1726773091.80560: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ac 11503 1726773091.80565: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8119 1726773091.81186: no more pending results, returning what we have 8119 1726773091.81194: results queue empty 8119 1726773091.81196: checking for any_errors_fatal 8119 1726773091.81207: done checking for any_errors_fatal 8119 1726773091.81209: checking for max_fail_percentage 8119 1726773091.81213: done checking for max_fail_percentage 8119 1726773091.81215: checking to see if all hosts have failed and the running result is not ok 8119 1726773091.81217: done checking to see if all hosts have failed 8119 1726773091.81220: getting the remaining hosts for this loop 8119 1726773091.81223: done getting the remaining hosts for this loop 8119 1726773091.81231: building list of next tasks for hosts 8119 1726773091.81234: getting the next task for host managed_node2 8119 1726773091.81242: done getting next task for host managed_node2 8119 1726773091.81246: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773091.81250: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8119 1726773091.81253: done building task lists 8119 1726773091.81255: counting tasks in each state of execution 8119 1726773091.81259: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773091.81262: advancing hosts in ITERATING_TASKS 8119 1726773091.81264: starting to advance hosts 8119 1726773091.81267: getting the next task for host managed_node2 8119 1726773091.81271: done getting next task for host managed_node2 8119 1726773091.81274: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773091.81278: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773091.81280: done advancing hosts to next task 8119 1726773091.81297: getting variables 8119 1726773091.81302: in VariableManager get_vars() 8119 1726773091.81340: Calling all_inventory to load vars for managed_node2 8119 1726773091.81346: Calling groups_inventory to load vars for managed_node2 8119 1726773091.81350: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773091.81380: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.81399: Calling all_plugins_play to load vars for managed_node2 8119 1726773091.81419: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.81435: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773091.81454: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.81465: Calling groups_plugins_play to load vars for managed_node2 8119 1726773091.81482: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.81519: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.81544: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773091.81863: done with get_vars() 8119 1726773091.81879: done getting variables 8119 1726773091.81890: sending task start callback, copying the task so we can template it temporarily 8119 1726773091.81893: done copying, going to template now 8119 1726773091.81897: done templating 8119 1726773091.81899: here goes the callback... 
TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.396) 0:01:26.375 **** 8119 1726773091.81925: sending task start callback 8119 1726773091.81929: entering _queue_task() for managed_node2/slurp 8119 1726773091.82110: worker is 1 (out of 1 available) 8119 1726773091.82146: exiting _queue_task() for managed_node2/slurp 8119 1726773091.82225: done queuing things up, now waiting for results queue to drain 8119 1726773091.82231: waiting for pending results... 11525 1726773091.82493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 11525 1726773091.82561: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009ad 11525 1726773091.82619: calling self._execute() 11525 1726773091.84495: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11525 1726773091.84613: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11525 1726773091.84685: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11525 1726773091.84725: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11525 1726773091.84766: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11525 1726773091.84806: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11525 1726773091.84873: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11525 1726773091.84903: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11525 1726773091.84927: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11525 1726773091.85034: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11525 1726773091.85050: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11525 1726773091.85064: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11525 1726773091.85289: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11525 1726773091.85335: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11525 1726773091.85349: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11525 1726773091.85363: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11525 1726773091.85370: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11525 1726773091.85480: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11525 1726773091.85506: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11525 1726773091.85539: Loading ActionModule 'normal' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11525 1726773091.85560: starting attempt loop 11525 1726773091.85564: running the handler 11525 1726773091.85575: _low_level_execute_command(): starting 11525 1726773091.85580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11525 1726773091.88178: stdout chunk (state=2): >>>/root <<< 11525 1726773091.88547: stderr chunk (state=3): >>><<< 11525 1726773091.88554: stdout chunk (state=3): >>><<< 11525 1726773091.88581: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11525 1726773091.88604: _low_level_execute_command(): starting 11525 1726773091.88612: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721 `" && echo ansible-tmp-1726773091.8859644-11525-203253317476721="` echo /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721 `" ) && sleep 0' 11525 1726773091.91831: stdout chunk (state=2): >>>ansible-tmp-1726773091.8859644-11525-203253317476721=/root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721 <<< 11525 1726773091.92060: stderr chunk (state=3): >>><<< 11525 1726773091.92067: stdout chunk (state=3): >>><<< 11525 1726773091.92093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773091.8859644-11525-203253317476721=/root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721 , stderr= 11525 1726773091.92192: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/slurp-ZIP_DEFLATED 11525 1726773091.92261: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721/AnsiballZ_slurp.py 11525 1726773091.93090: Sending initial data 11525 1726773091.93104: Sent initial data (153 bytes) 11525 1726773091.95496: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmppdbkoz3l /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721/AnsiballZ_slurp.py <<< 11525 1726773091.96577: stderr chunk (state=3): >>><<< 11525 1726773091.96589: stdout chunk (state=3): >>><<< 11525 1726773091.96620: done transferring module to remote 11525 1726773091.96640: _low_level_execute_command(): starting 11525 1726773091.96646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721/ /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721/AnsiballZ_slurp.py && sleep 0' 11525 1726773091.99539: stderr chunk (state=2): >>><<< 11525 1726773091.99554: stdout chunk (state=2): >>><<< 11525 1726773091.99577: _low_level_execute_command() done: rc=0, stdout=, stderr= 11525 1726773091.99580: _low_level_execute_command(): starting 11525 1726773091.99590: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721/AnsiballZ_slurp.py && sleep 0' 11525 1726773092.14476: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11525 1726773092.15423: stderr chunk (state=3): 
>>>Shared connection to 10.31.8.150 closed. <<< 11525 1726773092.15467: stderr chunk (state=3): >>><<< 11525 1726773092.15472: stdout chunk (state=3): >>><<< 11525 1726773092.15495: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.150 closed. 11525 1726773092.15534: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11525 1726773092.15550: _low_level_execute_command(): starting 11525 1726773092.15558: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773091.8859644-11525-203253317476721/ > /dev/null 2>&1 && sleep 0' 11525 1726773092.18214: stderr chunk (state=2): >>><<< 11525 1726773092.18226: stdout chunk (state=2): >>><<< 11525 1726773092.18247: _low_level_execute_command() done: rc=0, stdout=, stderr= 11525 1726773092.18254: handler run complete 11525 1726773092.18279: attempt loop complete, returning result 11525 1726773092.18294: _execute() done 11525 1726773092.18297: dumping result to json 11525 1726773092.18299: done dumping result, returning 11525 1726773092.18317: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [12a3200b-1e9d-1dbd-cc52-0000000009ad] 11525 1726773092.18331: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ad 11525 1726773092.18369: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ad 11525 1726773092.18374: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8119 1726773092.18567: no more pending results, returning what we have 8119 1726773092.18572: results queue empty 8119 1726773092.18575: checking for any_errors_fatal 8119 1726773092.18581: done checking for any_errors_fatal 8119 1726773092.18586: checking for max_fail_percentage 8119 1726773092.18589: done checking for max_fail_percentage 8119 1726773092.18591: checking to see if all hosts have failed and the running result is not ok 8119 1726773092.18593: done checking to see if all hosts have failed 8119 1726773092.18595: getting the remaining hosts for this loop 8119 1726773092.18597: done getting the remaining hosts for this loop 8119 1726773092.18605: building list of next tasks for hosts 8119 1726773092.18610: getting the next task for host managed_node2 8119 1726773092.18617: done getting next task for host managed_node2 8119 1726773092.18622: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773092.18626: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773092.18628: done building task lists 8119 1726773092.18630: counting tasks in each state of execution 8119 1726773092.18634: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773092.18637: advancing hosts in ITERATING_TASKS 8119 1726773092.18639: starting to advance hosts 8119 1726773092.18642: getting the next task for host managed_node2 8119 1726773092.18645: done getting next task for host managed_node2 8119 1726773092.18648: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773092.18651: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773092.18653: done advancing hosts to next task 8119 1726773092.18668: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773092.18671: getting variables 8119 1726773092.18674: in VariableManager get_vars() 8119 1726773092.18706: Calling all_inventory to load vars for managed_node2 8119 1726773092.18713: Calling groups_inventory to load vars for managed_node2 8119 1726773092.18716: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773092.18739: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.18749: Calling all_plugins_play to load vars for managed_node2 8119 1726773092.18759: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.18768: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773092.18777: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.18786: Calling groups_plugins_play to load vars for managed_node2 8119 1726773092.18799: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.18825: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.18839: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, 
class_only=False) 8119 1726773092.19079: done with get_vars() 8119 1726773092.19092: done getting variables 8119 1726773092.19097: sending task start callback, copying the task so we can template it temporarily 8119 1726773092.19099: done copying, going to template now 8119 1726773092.19101: done templating 8119 1726773092.19102: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.371) 0:01:26.747 **** 8119 1726773092.19121: sending task start callback 8119 1726773092.19123: entering _queue_task() for managed_node2/set_fact 8119 1726773092.19248: worker is 1 (out of 1 available) 8119 1726773092.19286: exiting _queue_task() for managed_node2/set_fact 8119 1726773092.19361: done queuing things up, now waiting for results queue to drain 8119 1726773092.19367: waiting for pending results... 11541 1726773092.19435: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 11541 1726773092.19499: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009ae 11541 1726773092.19551: calling self._execute() 11541 1726773092.21707: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11541 1726773092.21792: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11541 1726773092.21850: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11541 1726773092.21877: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11541 1726773092.21907: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11541 1726773092.21944: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11541 1726773092.21989: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11541 1726773092.22013: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11541 1726773092.22031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11541 1726773092.22124: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11541 1726773092.22143: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11541 1726773092.22161: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11541 1726773092.22491: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11541 1726773092.22530: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11541 1726773092.22541: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11541 1726773092.22551: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11541 1726773092.22556: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11541 
1726773092.22653: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11541 1726773092.22671: starting attempt loop 11541 1726773092.22673: running the handler 11541 1726773092.22687: handler run complete 11541 1726773092.22691: attempt loop complete, returning result 11541 1726773092.22692: _execute() done 11541 1726773092.22694: dumping result to json 11541 1726773092.22696: done dumping result, returning 11541 1726773092.22700: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [12a3200b-1e9d-1dbd-cc52-0000000009ae] 11541 1726773092.22707: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ae 11541 1726773092.22736: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009ae 11541 1726773092.22740: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8119 1726773092.22998: no more pending results, returning what we have 8119 1726773092.23003: results queue empty 8119 1726773092.23005: checking for any_errors_fatal 8119 1726773092.23014: done checking for any_errors_fatal 8119 1726773092.23016: checking for max_fail_percentage 8119 1726773092.23020: done checking for max_fail_percentage 8119 1726773092.23022: checking to see if all hosts have failed and the running result is not ok 8119 1726773092.23024: done checking to see if all hosts have failed 8119 1726773092.23025: getting the remaining hosts for this loop 8119 1726773092.23027: done getting the remaining hosts for this loop 8119 1726773092.23033: building list of next tasks for hosts 8119 1726773092.23035: getting the next task for host managed_node2 8119 1726773092.23041: done getting next task for host managed_node2 8119 1726773092.23043: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773092.23046: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773092.23048: done building task lists 8119 1726773092.23049: counting tasks in each state of execution 8119 1726773092.23052: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773092.23053: advancing hosts in ITERATING_TASKS 8119 1726773092.23055: starting to advance hosts 8119 1726773092.23056: getting the next task for host managed_node2 8119 1726773092.23058: done getting next task for host managed_node2 8119 1726773092.23060: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773092.23062: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773092.23064: done advancing hosts to next task 8119 1726773092.23075: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773092.23078: getting variables 8119 1726773092.23080: in VariableManager get_vars() 8119 1726773092.23121: Calling all_inventory to load vars for managed_node2 8119 1726773092.23126: Calling groups_inventory to load vars for managed_node2 8119 1726773092.23129: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773092.23151: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.23161: Calling all_plugins_play to load vars for managed_node2 8119 1726773092.23171: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.23180: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773092.23193: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.23200: Calling groups_plugins_play to load vars for managed_node2 8119 1726773092.23212: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.23236: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.23251: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.23494: done with get_vars() 8119 1726773092.23506: done getting variables 8119 1726773092.23515: sending task start callback, copying the task so we can template it temporarily 8119 1726773092.23517: done copying, going to template now 8119 1726773092.23519: done templating 8119 1726773092.23520: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.044) 0:01:26.791 **** 8119 1726773092.23536: sending task start callback 8119 1726773092.23538: entering _queue_task() for managed_node2/copy 8119 1726773092.23691: worker is 1 (out of 1 available) 8119 1726773092.23731: exiting _queue_task() for managed_node2/copy 8119 1726773092.23805: done queuing things up, now waiting for results queue to drain 8119 1726773092.23813: waiting for pending results... 
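The Get active_profile result above is base64-encoded because the slurp module always returns file content that way; the Set active_profile task then stores the whitespace-stripped value as the fact __kernel_settings_active_profile ("virtual-guest kernel_settings"). The following minimal sketch just decodes the literal content string from the log to show that round trip; it is illustrative only and not code from the role.

import base64

# Decode the slurp payload shown above; the stripped result matches the
# __kernel_settings_active_profile fact, and the raw length matches the
# 30-byte size reported by the stat call further down.
content = base64.b64decode("dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK").decode("utf-8")
print(repr(content))    # 'virtual-guest kernel_settings\n' (30 bytes)
print(content.strip())  # virtual-guest kernel_settings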
11544 1726773092.24045: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 11544 1726773092.24111: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009af 11544 1726773092.24169: calling self._execute() 11544 1726773092.26345: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11544 1726773092.26436: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11544 1726773092.26495: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11544 1726773092.26526: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11544 1726773092.26555: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11544 1726773092.26588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11544 1726773092.26649: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11544 1726773092.26672: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11544 1726773092.26694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11544 1726773092.26776: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11544 1726773092.26797: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11544 1726773092.26819: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11544 1726773092.27149: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11544 1726773092.27199: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11544 1726773092.27214: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11544 1726773092.27225: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11544 1726773092.27230: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11544 1726773092.27360: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11544 1726773092.27385: starting attempt loop 11544 1726773092.27389: running the handler 11544 1726773092.27399: _low_level_execute_command(): starting 11544 1726773092.27404: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11544 1726773092.30205: stdout chunk (state=2): >>>/root <<< 11544 1726773092.30323: stderr chunk (state=3): >>><<< 11544 1726773092.30328: stdout chunk (state=3): >>><<< 11544 1726773092.30352: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11544 1726773092.30367: _low_level_execute_command(): starting 11544 1726773092.30372: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341 `" && echo ansible-tmp-1726773092.303613-11544-272702479254341="` echo /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341 `" ) && sleep 0' 11544 1726773092.33115: stdout chunk (state=2): >>>ansible-tmp-1726773092.303613-11544-272702479254341=/root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341 <<< 11544 1726773092.33235: stderr chunk (state=3): >>><<< 11544 1726773092.33241: stdout chunk (state=3): >>><<< 11544 1726773092.33263: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773092.303613-11544-272702479254341=/root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341 , stderr= 11544 1726773092.33410: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11544 1726773092.33466: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_stat.py 11544 1726773092.33779: Sending initial data 11544 1726773092.33796: Sent initial data (151 bytes) 11544 1726773092.36266: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpwik2lxlv /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_stat.py <<< 11544 1726773092.37298: stderr chunk (state=3): >>><<< 11544 1726773092.37305: stdout chunk (state=3): >>><<< 11544 1726773092.37333: done transferring module to remote 11544 1726773092.37353: _low_level_execute_command(): starting 11544 1726773092.37358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/ /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_stat.py && sleep 0' 11544 1726773092.40006: stderr chunk (state=2): >>><<< 11544 1726773092.40020: stdout chunk (state=2): >>><<< 11544 1726773092.40040: _low_level_execute_command() done: rc=0, stdout=, stderr= 11544 1726773092.40043: _low_level_execute_command(): starting 11544 1726773092.40050: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_stat.py && sleep 0' 11544 1726773092.56197: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773092.1419022, "mtime": 1726773084.0249803, "ctime": 1726773084.0249803, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11544 1726773092.57354: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11544 1726773092.57365: stdout chunk (state=3): >>><<< 11544 1726773092.57377: stderr chunk (state=3): >>><<< 11544 1726773092.57399: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773092.1419022, "mtime": 1726773084.0249803, "ctime": 1726773084.0249803, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 11544 1726773092.57486: done with _execute_module (stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11544 1726773092.57611: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11544 1726773092.57670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_file.py 11544 1726773092.58519: Sending initial data 11544 1726773092.58535: Sent initial data (151 bytes) 11544 1726773092.61526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpeve6oec7 /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_file.py <<< 11544 1726773092.62712: stderr chunk (state=3): >>><<< 11544 1726773092.62723: stdout chunk (state=3): >>><<< 11544 1726773092.62755: done transferring module to remote 11544 1726773092.62775: _low_level_execute_command(): starting 11544 1726773092.62784: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/ /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_file.py && sleep 0' 11544 1726773092.65663: stderr chunk (state=2): >>><<< 11544 1726773092.65678: stdout chunk (state=2): >>><<< 11544 1726773092.65705: _low_level_execute_command() done: rc=0, stdout=, stderr= 11544 1726773092.65715: _low_level_execute_command(): starting 11544 1726773092.65724: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/AnsiballZ_file.py && sleep 0' 11544 1726773092.81856: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmphgj70pvb", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11544 1726773092.82914: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11544 1726773092.82962: stderr chunk (state=3): >>><<< 11544 1726773092.82968: stdout chunk (state=3): >>><<< 11544 1726773092.82991: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmphgj70pvb", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
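The task above is a copy action, which explains the two module runs: stat is executed first against /etc/tuned/active_profile, and because the existing checksum already matches the desired content, no file transfer happens and only the file module (state=file, force=false) is run to enforce mode 0600, so the task reports changed=false below. The sketch that follows is a hypothetical reconstruction of that checksum comparison, assuming the desired content is exactly the decoded slurp payload shown earlier; it is not the copy action's actual code.

import hashlib

# Desired content assumed to be the decoded slurp payload from the
# Get active_profile task; the checksum is copied from the stat result above.
desired = "virtual-guest kernel_settings\n".encode("utf-8")
reported_checksum = "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd"

if hashlib.sha1(desired).hexdigest() == reported_checksum:
    print("destination already holds the desired content; only mode/ownership are enforced")
else:
    print("content differs; a real copy would transfer a new file to the destination")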
11544 1726773092.83059: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmphgj70pvb', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11544 1726773092.83074: _low_level_execute_command(): starting 11544 1726773092.83079: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773092.303613-11544-272702479254341/ > /dev/null 2>&1 && sleep 0' 11544 1726773092.85742: stderr chunk (state=2): >>><<< 11544 1726773092.85755: stdout chunk (state=2): >>><<< 11544 1726773092.85774: _low_level_execute_command() done: rc=0, stdout=, stderr= 11544 1726773092.85785: handler run complete 11544 1726773092.85820: attempt loop complete, returning result 11544 1726773092.85834: _execute() done 11544 1726773092.85836: dumping result to json 11544 1726773092.85840: done dumping result, returning 11544 1726773092.85854: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [12a3200b-1e9d-1dbd-cc52-0000000009af] 11544 1726773092.85871: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009af 11544 1726773092.85911: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009af 11544 1726773092.85979: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8119 1726773092.86087: no more pending results, returning what we have 8119 1726773092.86095: results queue empty 8119 1726773092.86097: checking for any_errors_fatal 8119 1726773092.86101: done checking for any_errors_fatal 8119 1726773092.86102: checking for max_fail_percentage 8119 1726773092.86105: done checking for max_fail_percentage 8119 1726773092.86106: checking to see if all hosts have failed and the running result is not ok 8119 1726773092.86108: done checking to see if all hosts have failed 8119 1726773092.86110: getting the remaining hosts for this loop 8119 1726773092.86112: done getting the remaining hosts for this loop 8119 1726773092.86118: building list of next tasks for hosts 8119 1726773092.86122: getting the next task for host managed_node2 8119 1726773092.86129: done getting next task for host managed_node2 8119 1726773092.86132: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773092.86136: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773092.86137: done building task lists 8119 1726773092.86138: counting tasks in each state of execution 8119 1726773092.86142: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773092.86143: advancing hosts in ITERATING_TASKS 8119 1726773092.86145: starting to advance hosts 8119 1726773092.86147: getting the next task for host managed_node2 8119 1726773092.86149: done getting next task for host managed_node2 8119 1726773092.86151: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773092.86154: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773092.86155: done advancing hosts to next task 8119 1726773092.86166: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773092.86169: getting variables 8119 1726773092.86172: in VariableManager get_vars() 8119 1726773092.86208: Calling all_inventory to load vars for managed_node2 8119 1726773092.86215: Calling groups_inventory to load vars for managed_node2 8119 1726773092.86218: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773092.86247: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.86261: Calling all_plugins_play to load vars for managed_node2 8119 1726773092.86275: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.86290: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773092.86307: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.86318: Calling groups_plugins_play to load vars for managed_node2 8119 1726773092.86333: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.86360: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.86380: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773092.86611: done with get_vars() 8119 1726773092.86621: done getting variables 8119 1726773092.86625: sending task start callback, copying the task so we can template it 
temporarily 8119 1726773092.86627: done copying, going to template now 8119 1726773092.86629: done templating 8119 1726773092.86630: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.631) 0:01:27.422 **** 8119 1726773092.86645: sending task start callback 8119 1726773092.86647: entering _queue_task() for managed_node2/copy 8119 1726773092.86776: worker is 1 (out of 1 available) 8119 1726773092.86816: exiting _queue_task() for managed_node2/copy 8119 1726773092.86890: done queuing things up, now waiting for results queue to drain 8119 1726773092.86895: waiting for pending results... 11575 1726773092.86959: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 11575 1726773092.87016: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b0 11575 1726773092.87062: calling self._execute() 11575 1726773092.88936: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11575 1726773092.89023: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11575 1726773092.89089: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11575 1726773092.89120: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11575 1726773092.89152: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11575 1726773092.89184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11575 1726773092.89231: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11575 1726773092.89257: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11575 1726773092.89275: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11575 1726773092.89357: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11575 1726773092.89376: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11575 1726773092.89394: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11575 1726773092.89620: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11575 1726773092.89657: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11575 1726773092.89668: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11575 1726773092.89678: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11575 1726773092.89686: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11575 1726773092.89782: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11575 1726773092.89807: starting attempt loop 11575 1726773092.89811: running the handler 11575 1726773092.89821: _low_level_execute_command(): starting 11575 1726773092.89826: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11575 1726773092.92301: stdout chunk (state=2): >>>/root <<< 11575 1726773092.92418: stderr chunk (state=3): >>><<< 11575 1726773092.92423: stdout chunk (state=3): >>><<< 11575 1726773092.92443: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11575 1726773092.92461: _low_level_execute_command(): starting 11575 1726773092.92469: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037 `" && echo ansible-tmp-1726773092.9245489-11575-51322351347037="` echo /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037 `" ) && sleep 0' 11575 1726773092.95296: stdout chunk (state=2): >>>ansible-tmp-1726773092.9245489-11575-51322351347037=/root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037 <<< 11575 1726773092.95431: stderr chunk (state=3): >>><<< 11575 1726773092.95437: stdout chunk (state=3): >>><<< 11575 1726773092.95457: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773092.9245489-11575-51322351347037=/root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037 , stderr= 11575 1726773092.95607: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11575 1726773092.95663: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_stat.py 11575 1726773092.96005: Sending initial data 11575 1726773092.96019: Sent initial data (151 bytes) 11575 1726773092.98516: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmplbk8j23j /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_stat.py <<< 11575 1726773092.99524: stderr chunk (state=3): >>><<< 11575 1726773092.99533: stdout chunk (state=3): >>><<< 11575 1726773092.99559: done transferring module to remote 11575 1726773092.99574: _low_level_execute_command(): starting 11575 1726773092.99579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/ /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_stat.py && sleep 0' 11575 1726773093.02188: stderr chunk (state=2): >>><<< 11575 1726773093.02201: stdout chunk (state=2): >>><<< 11575 1726773093.02227: _low_level_execute_command() done: rc=0, stdout=, stderr= 11575 1726773093.02233: _low_level_execute_command(): starting 11575 1726773093.02240: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_stat.py && sleep 0' 11575 1726773093.18133: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773082.1430085, "mtime": 
1726773084.0249803, "ctime": 1726773084.0249803, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11575 1726773093.19220: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11575 1726773093.19271: stderr chunk (state=3): >>><<< 11575 1726773093.19279: stdout chunk (state=3): >>><<< 11575 1726773093.19306: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773082.1430085, "mtime": 1726773084.0249803, "ctime": 1726773084.0249803, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 
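Set profile_mode to manual follows the same stat-then-file pattern against /etc/tuned/profile_mode: the stat result above reports a 7-byte file, and the file module run that follows only enforces mode 0600 without rewriting content (changed=false). The check below assumes the file holds the literal string "manual" plus a trailing newline, which is inferred from the task name and the reported size rather than read from the log, so treat it as a plausibility check only.

import hashlib

# "manual" plus a newline is an inference from the task name and the 7-byte
# size reported by stat above; the checksum is copied from that same result.
expected = b"manual\n"
reported_checksum = "3ef9f23deed2e23d3ef2b88b842fb882313e15ce"

print(len(expected))                                            # 7, matching the reported size
print(hashlib.sha1(expected).hexdigest() == reported_checksum)  # True only if the inference holds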
11575 1726773093.19394: done with _execute_module (stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11575 1726773093.19488: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11575 1726773093.19541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_file.py 11575 1726773093.19879: Sending initial data 11575 1726773093.19897: Sent initial data (151 bytes) 11575 1726773093.22418: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpqjvuf8o8 /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_file.py <<< 11575 1726773093.23465: stderr chunk (state=3): >>><<< 11575 1726773093.23470: stdout chunk (state=3): >>><<< 11575 1726773093.23493: done transferring module to remote 11575 1726773093.23507: _low_level_execute_command(): starting 11575 1726773093.23515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/ /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_file.py && sleep 0' 11575 1726773093.26057: stderr chunk (state=2): >>><<< 11575 1726773093.26068: stdout chunk (state=2): >>><<< 11575 1726773093.26094: _low_level_execute_command() done: rc=0, stdout=, stderr= 11575 1726773093.26099: _low_level_execute_command(): starting 11575 1726773093.26106: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/AnsiballZ_file.py && sleep 0' 11575 1726773093.41545: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpv90ap9f9", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11575 1726773093.42577: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11575 1726773093.42632: stderr chunk (state=3): >>><<< 11575 1726773093.42638: stdout chunk (state=3): >>><<< 11575 1726773093.42657: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpv90ap9f9", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11575 1726773093.42691: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpv90ap9f9', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11575 1726773093.42709: _low_level_execute_command(): starting 11575 1726773093.42720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773092.9245489-11575-51322351347037/ > /dev/null 2>&1 && sleep 0' 11575 1726773093.45400: stderr chunk (state=2): >>><<< 11575 1726773093.45412: stdout chunk (state=2): >>><<< 11575 1726773093.45434: _low_level_execute_command() done: rc=0, stdout=, stderr= 11575 1726773093.45448: handler run complete 11575 1726773093.45481: attempt loop complete, returning result 11575 1726773093.45497: _execute() done 11575 1726773093.45499: dumping result to json 11575 1726773093.45503: done dumping result, returning 11575 1726773093.45516: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [12a3200b-1e9d-1dbd-cc52-0000000009b0] 11575 1726773093.45530: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b0 11575 1726773093.45570: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b0 11575 1726773093.45574: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8119 1726773093.45784: no more pending results, returning what we have 8119 1726773093.45791: results queue empty 8119 1726773093.45793: checking for any_errors_fatal 
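For reference, the stat and file results above describe /etc/tuned/profile_mode as a 7-byte, root-owned, mode 0600 file with sha1 3ef9f23d...; the content "manual" (plus a trailing newline) is an inference from the task name "Set profile_mode to manual" and the reported size, not something printed verbatim in the log. A quick stdlib-only sketch of the same checks, with that expected content treated as an assumption:

# Minimal sketch (not part of the role): inspect a profile_mode file the way the
# stat/file tasks above describe it -- 0600, root-owned, sha1-checksummed.
# The expected content "manual\n" is an assumption inferred from the task name
# and the reported 7-byte size.
import hashlib
import os
import stat

def check_profile_mode(path="/etc/tuned/profile_mode", expected=b"manual\n"):
    st = os.stat(path)
    with open(path, "rb") as f:
        data = f.read()
    return {
        "mode": oct(stat.S_IMODE(st.st_mode)),      # e.g. '0o600', as in the log
        "size": st.st_size,
        "sha1": hashlib.sha1(data).hexdigest(),     # same algorithm the stat module reports
        "content_ok": data == expected,
    }

if __name__ == "__main__":
    print(check_profile_mode())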
8119 1726773093.45798: done checking for any_errors_fatal 8119 1726773093.45800: checking for max_fail_percentage 8119 1726773093.45803: done checking for max_fail_percentage 8119 1726773093.45805: checking to see if all hosts have failed and the running result is not ok 8119 1726773093.45807: done checking to see if all hosts have failed 8119 1726773093.45809: getting the remaining hosts for this loop 8119 1726773093.45812: done getting the remaining hosts for this loop 8119 1726773093.45819: building list of next tasks for hosts 8119 1726773093.45822: getting the next task for host managed_node2 8119 1726773093.45829: done getting next task for host managed_node2 8119 1726773093.45833: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773093.45837: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773093.45839: done building task lists 8119 1726773093.45841: counting tasks in each state of execution 8119 1726773093.45845: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773093.45847: advancing hosts in ITERATING_TASKS 8119 1726773093.45850: starting to advance hosts 8119 1726773093.45852: getting the next task for host managed_node2 8119 1726773093.45856: done getting next task for host managed_node2 8119 1726773093.45859: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773093.45861: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773093.45863: done advancing hosts to next task 8119 1726773093.45910: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773093.45917: getting variables 8119 1726773093.45920: in VariableManager get_vars() 8119 1726773093.45947: Calling all_inventory to load vars for managed_node2 8119 1726773093.45951: Calling groups_inventory to load vars for managed_node2 8119 1726773093.45953: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773093.45974: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.45987: Calling all_plugins_play to load vars for managed_node2 8119 1726773093.45999: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.46008: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773093.46024: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.46032: Calling groups_plugins_play to load vars for managed_node2 8119 1726773093.46042: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.46060: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.46073: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.46307: done with get_vars() 8119 1726773093.46319: done getting variables 8119 1726773093.46324: sending task start callback, copying the task so we can template it temporarily 8119 1726773093.46326: done copying, going to template now 8119 1726773093.46327: done templating 8119 1726773093.46329: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:33 -0400 (0:00:00.596) 0:01:28.019 **** 8119 1726773093.46344: sending task start callback 8119 1726773093.46346: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773093.46473: worker is 1 (out of 1 available) 8119 1726773093.46514: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773093.46588: done queuing things up, now waiting for results queue to drain 8119 1726773093.46594: waiting for pending results... 
11588 1726773093.46656: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 11588 1726773093.46711: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b1 11588 1726773093.46760: calling self._execute() 11588 1726773093.48592: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11588 1726773093.48680: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11588 1726773093.48739: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11588 1726773093.48769: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11588 1726773093.48801: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11588 1726773093.48835: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11588 1726773093.48880: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11588 1726773093.48906: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11588 1726773093.48930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11588 1726773093.49021: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11588 1726773093.49043: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11588 1726773093.49058: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11588 1726773093.49352: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11588 1726773093.49399: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11588 1726773093.49417: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11588 1726773093.49430: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11588 1726773093.49435: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11588 1726773093.49520: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11588 1726773093.49536: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11588 1726773093.49562: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11588 1726773093.49578: starting attempt loop 11588 1726773093.49580: running the handler 11588 1726773093.49590: _low_level_execute_command(): starting 11588 1726773093.49594: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11588 1726773093.52078: stdout chunk (state=2): >>>/root <<< 11588 1726773093.52197: stderr chunk (state=3): >>><<< 11588 1726773093.52202: 
stdout chunk (state=3): >>><<< 11588 1726773093.52225: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11588 1726773093.52239: _low_level_execute_command(): starting 11588 1726773093.52244: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623 `" && echo ansible-tmp-1726773093.5223336-11588-147120373667623="` echo /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623 `" ) && sleep 0' 11588 1726773093.54951: stdout chunk (state=2): >>>ansible-tmp-1726773093.5223336-11588-147120373667623=/root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623 <<< 11588 1726773093.55076: stderr chunk (state=3): >>><<< 11588 1726773093.55082: stdout chunk (state=3): >>><<< 11588 1726773093.55104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773093.5223336-11588-147120373667623=/root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623 , stderr= 11588 1726773093.55187: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 11588 1726773093.55244: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623/AnsiballZ_kernel_settings_get_config.py 11588 1726773093.55542: Sending initial data 11588 1726773093.55558: Sent initial data (174 bytes) 11588 1726773093.58004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpcpq6sdbm /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623/AnsiballZ_kernel_settings_get_config.py <<< 11588 1726773093.59020: stderr chunk (state=3): >>><<< 11588 1726773093.59027: stdout chunk (state=3): >>><<< 11588 1726773093.59050: done transferring module to remote 11588 1726773093.59064: _low_level_execute_command(): starting 11588 1726773093.59068: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623/ /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11588 1726773093.61660: stderr chunk (state=2): >>><<< 11588 1726773093.61674: stdout chunk (state=2): >>><<< 11588 1726773093.61694: _low_level_execute_command() done: rc=0, stdout=, stderr= 11588 1726773093.61700: _low_level_execute_command(): starting 11588 1726773093.61707: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11588 1726773093.76695: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 11588 1726773093.77675: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11588 1726773093.77726: stderr chunk (state=3): >>><<< 11588 1726773093.77735: stdout chunk (state=3): >>><<< 11588 1726773093.77757: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 11588 1726773093.77787: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11588 1726773093.77800: _low_level_execute_command(): starting 11588 1726773093.77805: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773093.5223336-11588-147120373667623/ > /dev/null 2>&1 && sleep 0' 11588 1726773093.80476: stderr chunk (state=2): >>><<< 11588 1726773093.80489: stdout chunk (state=2): >>><<< 11588 1726773093.80507: _low_level_execute_command() done: rc=0, stdout=, stderr= 11588 1726773093.80514: handler run complete 11588 1726773093.80543: attempt loop complete, returning result 11588 1726773093.80561: _execute() done 11588 1726773093.80564: dumping result to json 11588 1726773093.80567: done dumping result, returning 11588 1726773093.80581: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [12a3200b-1e9d-1dbd-cc52-0000000009b1] 11588 1726773093.80594: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b1 11588 1726773093.80631: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b1 11588 1726773093.80635: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530" }, "sysfs": { "/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8119 1726773093.80911: no more pending results, returning what we have 8119 1726773093.80918: results queue empty 8119 1726773093.80920: checking for any_errors_fatal 8119 1726773093.80925: done checking for any_errors_fatal 8119 1726773093.80926: checking for max_fail_percentage 8119 1726773093.80928: done checking for max_fail_percentage 8119 1726773093.80929: checking to see if all hosts have failed and the running result is not ok 8119 1726773093.80931: done checking to see if all hosts have failed 8119 
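The data returned by kernel_settings_get_config mirrors the sections of the tuned profile at /etc/tuned/kernel_settings/tuned.conf: [main], [sysctl] and [sysfs]. The real module ships inside the collection; the sketch below is only an illustration of how the same {section: {option: value}} shape could be produced with configparser, assuming the file is plain INI as tuned profiles are:

# Illustrative sketch only (not the collection's kernel_settings_get_config module):
# read a tuned profile with configparser and return the {section: {option: value}}
# shape seen in the task result above.
import configparser

def read_tuned_conf(path="/etc/tuned/kernel_settings/tuned.conf"):
    parser = configparser.ConfigParser()
    parser.optionxform = str   # keep option names exactly as written, e.g. /sys/class/net/lo/mtu
    parser.read(path)
    return {section: dict(parser[section]) for section in parser.sections()}

# Expected shape, per the log:
# {"main": {"summary": "kernel settings"},
#  "sysctl": {"fs.epoll.max_user_watches": "785592", ...},
#  "sysfs": {"/sys/class/net/lo/mtu": "60666", ...}}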
1726773093.80932: getting the remaining hosts for this loop 8119 1726773093.80934: done getting the remaining hosts for this loop 8119 1726773093.80940: building list of next tasks for hosts 8119 1726773093.80942: getting the next task for host managed_node2 8119 1726773093.80947: done getting next task for host managed_node2 8119 1726773093.80950: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773093.80953: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773093.80954: done building task lists 8119 1726773093.80956: counting tasks in each state of execution 8119 1726773093.80958: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773093.80960: advancing hosts in ITERATING_TASKS 8119 1726773093.80961: starting to advance hosts 8119 1726773093.80963: getting the next task for host managed_node2 8119 1726773093.80965: done getting next task for host managed_node2 8119 1726773093.80967: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773093.80969: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773093.80970: done advancing hosts to next task 8119 1726773093.80981: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773093.80986: getting variables 8119 1726773093.80989: in VariableManager get_vars() 8119 1726773093.81019: Calling all_inventory to load vars for managed_node2 8119 1726773093.81023: Calling groups_inventory to load vars for managed_node2 8119 1726773093.81027: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773093.81051: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.81061: Calling all_plugins_play to load vars for managed_node2 8119 1726773093.81071: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.81080: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773093.81093: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.81100: Calling groups_plugins_play to load vars for managed_node2 8119 1726773093.81112: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.81132: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.81150: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773093.81359: done with get_vars() 8119 1726773093.81371: done getting variables 8119 1726773093.81378: sending task start callback, copying the task so we can template it temporarily 8119 1726773093.81380: done copying, going to template now 8119 1726773093.81381: done templating 8119 1726773093.81385: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:33 -0400 (0:00:00.350) 0:01:28.370 **** 8119 1726773093.81401: sending task start callback 8119 1726773093.81403: entering _queue_task() for managed_node2/template 8119 1726773093.81526: worker is 1 (out of 1 available) 8119 1726773093.81565: exiting _queue_task() for managed_node2/template 8119 1726773093.81641: done queuing things up, now waiting for results queue to drain 8119 1726773093.81646: waiting for pending results... 
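The worker output that follows repeats the per-module remote execution cycle already visible for the stat, file and get_config steps above: a private temp dir is created under ~/.ansible/tmp, the AnsiballZ payload is transferred by sftp, made executable, run with /usr/libexec/platform-python, and the temp dir is removed. A local-only re-enactment of that cycle, with placeholder paths and a trivial payload (not Ansible code):

# Rough local re-enactment of the remote command cycle the log shows for every
# module run.  On the managed host Ansible wraps each step in /bin/sh -c over SSH
# and ships the payload with sftp; the payload and interpreter here are placeholders.
import os
import subprocess
import tempfile

payload = "print('pretend AnsiballZ module ran')"

tmp = tempfile.mkdtemp(prefix="ansible-tmp-")           # stands in for ~/.ansible/tmp/ansible-tmp-...
module_path = os.path.join(tmp, "AnsiballZ_demo.py")
with open(module_path, "w") as f:                       # stands in for the sftp put
    f.write(payload)

subprocess.run(["/bin/sh", "-c", f"chmod u+x {tmp} {module_path}"], check=True)
subprocess.run(["/bin/sh", "-c", f"python3 {module_path}"], check=True)   # log uses /usr/libexec/platform-python
subprocess.run(["/bin/sh", "-c", f"rm -f -r {tmp}"], check=True)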
11600 1726773093.81707: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 11600 1726773093.81756: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b2 11600 1726773093.81804: calling self._execute() 11600 1726773093.83584: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11600 1726773093.83668: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11600 1726773093.83724: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11600 1726773093.83755: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11600 1726773093.83782: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11600 1726773093.83817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11600 1726773093.83874: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11600 1726773093.83900: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11600 1726773093.83921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11600 1726773093.84000: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11600 1726773093.84018: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11600 1726773093.84037: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11600 1726773093.84430: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11600 1726773093.84465: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11600 1726773093.84476: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11600 1726773093.84491: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11600 1726773093.84502: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11600 1726773093.84600: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11600 1726773093.84619: starting attempt loop 11600 1726773093.84622: running the handler 11600 1726773093.84629: _low_level_execute_command(): starting 11600 1726773093.84633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11600 1726773093.87131: stdout chunk (state=2): >>>/root <<< 11600 1726773093.87247: stderr chunk (state=3): >>><<< 11600 1726773093.87253: stdout chunk (state=3): >>><<< 11600 1726773093.87277: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11600 1726773093.87295: _low_level_execute_command(): starting 11600 1726773093.87301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548 `" && echo ansible-tmp-1726773093.8728857-11600-67853851188548="` echo /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548 `" ) && sleep 0' 11600 1726773093.90149: stdout chunk (state=2): >>>ansible-tmp-1726773093.8728857-11600-67853851188548=/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548 <<< 11600 1726773093.90275: stderr chunk (state=3): >>><<< 11600 1726773093.90281: stdout chunk (state=3): >>><<< 11600 1726773093.90304: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773093.8728857-11600-67853851188548=/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548 , stderr= 11600 1726773093.90330: evaluation_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 11600 1726773093.90353: search_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 11600 1726773093.91841: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11600 1726773093.91848: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11600 1726773093.91851: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11600 1726773093.91854: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11600 1726773093.91857: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11600 1726773093.91861: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.91863: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11600 1726773093.91865: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11600 1726773093.91867: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11600 1726773093.91889: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11600 1726773093.91894: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 11600 1726773093.91896: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92145: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92150: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11600 1726773093.92152: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11600 1726773093.92154: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11600 1726773093.92156: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11600 1726773093.92158: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92159: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11600 1726773093.92161: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11600 1726773093.92163: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11600 1726773093.92180: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92186: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11600 1726773093.92188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92216: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92219: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11600 1726773093.92221: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11600 1726773093.92223: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11600 1726773093.92225: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11600 1726773093.92227: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92229: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11600 1726773093.92231: Loading FilterModule 'urls' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11600 1726773093.92232: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11600 1726773093.92245: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92248: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11600 1726773093.92250: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92398: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92404: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11600 1726773093.92406: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11600 1726773093.92409: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11600 1726773093.92411: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11600 1726773093.92413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92415: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11600 1726773093.92416: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11600 1726773093.92418: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11600 1726773093.92436: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92439: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11600 1726773093.92441: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92676: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92682: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11600 1726773093.92687: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11600 1726773093.92690: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, 
class_only=False) 11600 1726773093.92692: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11600 1726773093.92694: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92696: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11600 1726773093.92698: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11600 1726773093.92700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11600 1726773093.92716: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92719: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11600 1726773093.92721: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92744: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92749: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11600 1726773093.92751: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11600 1726773093.92753: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11600 1726773093.92755: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11600 1726773093.92756: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.92758: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11600 1726773093.92760: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11600 1726773093.92761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11600 1726773093.92778: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11600 1726773093.92781: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11600 1726773093.92784: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11600 1726773093.93979: Loading ActionModule 'copy' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11600 1726773093.94060: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11600 1726773093.94111: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_stat.py 11600 1726773093.94468: Sending initial data 11600 1726773093.94483: Sent initial data (151 bytes) 11600 1726773093.96969: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp0l2kt0ku /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_stat.py <<< 11600 1726773093.97988: stderr chunk (state=3): >>><<< 11600 1726773093.97996: stdout chunk (state=3): >>><<< 11600 1726773093.98023: done transferring module to remote 11600 1726773093.98039: _low_level_execute_command(): starting 11600 1726773093.98043: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/ /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_stat.py && sleep 0' 11600 1726773094.00672: stderr chunk (state=2): >>><<< 11600 1726773094.00686: stdout chunk (state=2): >>><<< 11600 1726773094.00708: _low_level_execute_command() done: rc=0, stdout=, stderr= 11600 1726773094.00714: _low_level_execute_command(): starting 11600 1726773094.00723: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_stat.py && sleep 0' 11600 1726773094.16301: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 155189506, "dev": 51713, "nlink": 1, "atime": 1726773084.0119805, "mtime": 1726773083.2079926, "ctime": 1726773083.4549887, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "mimetype": "text/plain", "charset": "us-ascii", "version": "3098423658", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11600 1726773094.17796: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11600 1726773094.17804: stdout chunk (state=3): >>><<< 11600 1726773094.17814: stderr chunk (state=3): >>><<< 11600 1726773094.17829: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 155189506, "dev": 51713, "nlink": 1, "atime": 1726773084.0119805, "mtime": 1726773083.2079926, "ctime": 1726773083.4549887, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "mimetype": "text/plain", "charset": "us-ascii", "version": "3098423658", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 11600 1726773094.17896: done with _execute_module (stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11600 1726773094.18253: Sending initial data 11600 1726773094.18268: Sent initial data (159 bytes) 11600 1726773094.21235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpnsfb48_4/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source <<< 11600 1726773094.21572: stderr chunk (state=3): >>><<< 11600 1726773094.21577: stdout chunk (state=3): >>><<< 11600 1726773094.21602: _low_level_execute_command(): starting 11600 1726773094.21608: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/ /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source && sleep 0' 11600 1726773094.24243: stderr chunk (state=2): >>><<< 11600 1726773094.24256: stdout chunk (state=2): >>><<< 11600 1726773094.24281: _low_level_execute_command() done: rc=0, stdout=, stderr= 11600 1726773094.24423: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 11600 1726773094.24477: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_copy.py 11600 1726773094.25219: Sending initial data 11600 1726773094.25233: Sent initial data (151 bytes) 11600 1726773094.27506: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpydb9mhyv 
/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_copy.py <<< 11600 1726773094.28670: stderr chunk (state=3): >>><<< 11600 1726773094.28677: stdout chunk (state=3): >>><<< 11600 1726773094.28701: done transferring module to remote 11600 1726773094.28719: _low_level_execute_command(): starting 11600 1726773094.28724: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/ /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_copy.py && sleep 0' 11600 1726773094.31235: stderr chunk (state=2): >>><<< 11600 1726773094.31246: stdout chunk (state=2): >>><<< 11600 1726773094.31267: _low_level_execute_command() done: rc=0, stdout=, stderr= 11600 1726773094.31272: _low_level_execute_command(): starting 11600 1726773094.31278: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/AnsiballZ_copy.py && sleep 0' 11600 1726773094.46835: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source", "md5sum": "394928e588644c456053f3dec5f7c2ba", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 11600 1726773094.47851: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11600 1726773094.47902: stderr chunk (state=3): >>><<< 11600 1726773094.47908: stdout chunk (state=3): >>><<< 11600 1726773094.47932: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source", "md5sum": "394928e588644c456053f3dec5f7c2ba", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
11600 1726773094.47963: done with _execute_module (copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '0b586509c0bdce12a2dde058e3374dab88cf7f2c', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11600 1726773094.47997: _low_level_execute_command(): starting 11600 1726773094.48006: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/ > /dev/null 2>&1 && sleep 0' 11600 1726773094.50654: stderr chunk (state=2): >>><<< 11600 1726773094.50665: stdout chunk (state=2): >>><<< 11600 1726773094.50686: _low_level_execute_command() done: rc=0, stdout=, stderr= 11600 1726773094.50715: handler run complete 11600 1726773094.50750: attempt loop complete, returning result 11600 1726773094.50755: _execute() done 11600 1726773094.50757: dumping result to json 11600 1726773094.50761: done dumping result, returning 11600 1726773094.50774: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [12a3200b-1e9d-1dbd-cc52-0000000009b2] 11600 1726773094.50788: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b2 11600 1726773094.50827: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b2 11600 1726773094.50832: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "394928e588644c456053f3dec5f7c2ba", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "src": "/root/.ansible/tmp/ansible-tmp-1726773093.8728857-11600-67853851188548/source", "state": "file", "uid": 0 } 8119 1726773094.51074: no more pending results, returning what we have 8119 1726773094.51080: results queue empty 8119 1726773094.51082: checking for any_errors_fatal 8119 1726773094.51089: done checking for any_errors_fatal 8119 1726773094.51091: checking for max_fail_percentage 8119 1726773094.51094: done checking for max_fail_percentage 8119 1726773094.51096: checking to see if all hosts have failed and the running result is not ok 8119 1726773094.51099: done checking to see if all hosts have failed 8119 1726773094.51101: getting the remaining hosts for this loop 8119 1726773094.51103: done getting the remaining hosts for this loop 8119 1726773094.51110: building list of next tasks for hosts 8119 1726773094.51113: getting the next task for host managed_node2 8119 1726773094.51121: done getting next task for host managed_node2 8119 1726773094.51125: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773094.51129: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, 
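The template/copy step above reports changed: true because the sha1 of the freshly rendered source (0b586509...) differs from the checksum the preceding stat returned for the existing /etc/tuned/kernel_settings/tuned.conf (3107bf46...). A minimal sketch of that comparison, with placeholder paths:

# Minimal sketch of the changed/unchanged decision visible in the log: compare the
# sha1 of the freshly rendered source against the sha1 of the file already on disk.
import hashlib

def sha1_of(path):
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def needs_copy(rendered_src, dest):
    try:
        return sha1_of(rendered_src) != sha1_of(dest)
    except FileNotFoundError:
        return True   # destination missing: always a change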
fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773094.51132: done building task lists 8119 1726773094.51134: counting tasks in each state of execution 8119 1726773094.51137: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773094.51139: advancing hosts in ITERATING_TASKS 8119 1726773094.51141: starting to advance hosts 8119 1726773094.51144: getting the next task for host managed_node2 8119 1726773094.51147: done getting next task for host managed_node2 8119 1726773094.51150: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773094.51153: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773094.51155: done advancing hosts to next task 8119 1726773094.51170: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773094.51174: getting variables 8119 1726773094.51177: in VariableManager get_vars() 8119 1726773094.51213: Calling all_inventory to load vars for managed_node2 8119 1726773094.51221: Calling groups_inventory to load vars for managed_node2 8119 1726773094.51225: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773094.51248: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.51259: Calling all_plugins_play to load vars for managed_node2 8119 1726773094.51269: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.51278: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773094.51290: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.51298: Calling groups_plugins_play to load vars for managed_node2 8119 1726773094.51307: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.51327: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.51346: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.51552: done with get_vars() 8119 1726773094.51565: done getting variables 8119 1726773094.51571: sending task start callback, copying the task so we can template it temporarily 8119 1726773094.51572: done copying, going to template now 8119 1726773094.51574: done templating 8119 1726773094.51576: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:34 -0400 (0:00:00.701) 0:01:29.072 **** 8119 1726773094.51594: sending task start callback 8119 1726773094.51596: entering _queue_task() for managed_node2/service 8119 1726773094.51723: worker is 1 (out of 1 available) 8119 1726773094.51762: exiting _queue_task() for managed_node2/service 8119 1726773094.51836: done queuing things up, now waiting for results queue to drain 8119 1726773094.51841: waiting for pending results... 11627 1726773094.51920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 11627 1726773094.51976: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b3 11627 1726773094.54048: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11627 1726773094.54145: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11627 1726773094.54201: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11627 1726773094.54242: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11627 1726773094.54270: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11627 1726773094.54302: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11627 1726773094.54369: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11627 1726773094.54402: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11627 1726773094.54425: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11627 1726773094.54516: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11627 1726773094.54535: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11627 1726773094.54549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11627 1726773094.54711: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11627 1726773094.54716: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11627 1726773094.54718: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11627 1726773094.54720: Loading FilterModule 
'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11627 1726773094.54722: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11627 1726773094.54724: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11627 1726773094.54726: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11627 1726773094.54728: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11627 1726773094.54730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11627 1726773094.54748: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11627 1726773094.54751: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11627 1726773094.54753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11627 1726773094.54908: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11627 1726773094.54914: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11627 1726773094.54918: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11627 1726773094.54921: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11627 1726773094.54923: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11627 1726773094.54925: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11627 1726773094.54926: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11627 1726773094.54928: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11627 1726773094.54930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11627 1726773094.54948: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11627 1726773094.54951: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11627 1726773094.54953: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11627 1726773094.55171: when evaluation is False, skipping this task 11627 1726773094.55211: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11627 1726773094.55216: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11627 1726773094.55218: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11627 1726773094.55220: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11627 1726773094.55222: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11627 1726773094.55224: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11627 1726773094.55226: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11627 1726773094.55228: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11627 1726773094.55229: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11627 1726773094.55246: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11627 1726773094.55249: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11627 1726773094.55251: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11627 1726773094.55329: dumping result to json 11627 1726773094.55357: done dumping result, returning 11627 1726773094.55364: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [12a3200b-1e9d-1dbd-cc52-0000000009b3] 11627 1726773094.55375: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b3 11627 1726773094.55380: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b3 11627 1726773094.55382: WORKER PROCESS EXITING skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "item": "tuned", "skip_reason": "Conditional result was False" } 8119 1726773094.55544: no more pending results, returning what we have 8119 1726773094.55550: results queue empty 8119 1726773094.55552: checking for any_errors_fatal 8119 1726773094.55562: done checking for any_errors_fatal 8119 1726773094.55564: checking for max_fail_percentage 8119 1726773094.55567: done checking for max_fail_percentage 8119 1726773094.55569: checking to see if all hosts have failed and the running result is not ok 8119 1726773094.55571: done checking to see if all hosts have failed 8119 1726773094.55573: getting the remaining hosts for this loop 8119 1726773094.55575: done getting the remaining hosts for this loop 8119 1726773094.55582: building list of next tasks for 
hosts 8119 1726773094.55587: getting the next task for host managed_node2 8119 1726773094.55594: done getting next task for host managed_node2 8119 1726773094.55599: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773094.55603: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773094.55605: done building task lists 8119 1726773094.55607: counting tasks in each state of execution 8119 1726773094.55611: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773094.55613: advancing hosts in ITERATING_TASKS 8119 1726773094.55615: starting to advance hosts 8119 1726773094.55618: getting the next task for host managed_node2 8119 1726773094.55622: done getting next task for host managed_node2 8119 1726773094.55625: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773094.55628: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773094.55630: done advancing hosts to next task 8119 1726773094.55645: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773094.55649: getting variables 8119 1726773094.55652: in VariableManager get_vars() 8119 1726773094.55684: Calling all_inventory to load vars for managed_node2 8119 1726773094.55691: Calling groups_inventory to load vars for managed_node2 8119 1726773094.55694: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773094.55717: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.55728: Calling all_plugins_play to load vars for managed_node2 8119 1726773094.55738: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.55747: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773094.55757: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.55763: Calling groups_plugins_play to load vars for managed_node2 8119 1726773094.55772: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.55793: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.55813: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773094.56055: done with get_vars() 8119 1726773094.56068: done getting variables 8119 1726773094.56074: sending task start callback, copying the task so we can template it temporarily 8119 1726773094.56075: done copying, going to template now 8119 1726773094.56077: done templating 8119 1726773094.56079: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:34 -0400 (0:00:00.045) 0:01:29.117 **** 8119 1726773094.56102: sending task start callback 8119 1726773094.56105: entering _queue_task() for managed_node2/command 8119 1726773094.56269: worker is 1 (out of 1 available) 8119 1726773094.56311: exiting _queue_task() for managed_node2/command 8119 1726773094.56386: done queuing things up, now waiting for results queue to drain 8119 1726773094.56392: waiting for pending results... 
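For reference, the "Restart tuned to apply active profile, mode changes" task above is dispatched through the service action plugin with the loop item "tuned" and is skipped here because its conditional evaluates to false. A minimal sketch of an equivalent tasks-file fragment follows; the when-condition variable is an assumed placeholder for illustration, not taken from the role source.

    - name: Restart tuned to apply active profile, mode changes
      service:
        name: "{{ item }}"
        state: restarted
      loop:
        - tuned                                          # matches the loop item seen in the log
      when: kernel_settings_restart_needed | d(false)    # assumed placeholder; the real condition was false in this run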
11630 1726773094.56603: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 11630 1726773094.56669: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b4 11630 1726773094.56725: calling self._execute() 11630 1726773094.59153: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11630 1726773094.59258: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11630 1726773094.59325: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11630 1726773094.59361: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11630 1726773094.59397: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11630 1726773094.59445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11630 1726773094.59503: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11630 1726773094.59534: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11630 1726773094.59557: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11630 1726773094.59656: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11630 1726773094.59679: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11630 1726773094.59720: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11630 1726773094.60673: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11630 1726773094.60727: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11630 1726773094.60743: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11630 1726773094.60760: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11630 1726773094.60768: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11630 1726773094.60923: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11630 1726773094.60944: starting attempt loop 11630 1726773094.60948: running the handler 11630 1726773094.60960: _low_level_execute_command(): starting 11630 1726773094.60964: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11630 1726773094.64093: stdout chunk (state=2): >>>/root <<< 11630 1726773094.64155: stderr chunk (state=3): >>><<< 11630 1726773094.64161: stdout chunk (state=3): >>><<< 11630 1726773094.64188: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11630 1726773094.64205: _low_level_execute_command(): starting 11630 1726773094.64211: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776 `" && echo ansible-tmp-1726773094.641981-11630-79754030346776="` echo /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776 `" ) && sleep 0' 11630 1726773094.67469: stdout chunk (state=2): >>>ansible-tmp-1726773094.641981-11630-79754030346776=/root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776 <<< 11630 1726773094.67624: stderr chunk (state=3): >>><<< 11630 1726773094.67634: stdout chunk (state=3): >>><<< 11630 1726773094.67662: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773094.641981-11630-79754030346776=/root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776 , stderr= 11630 1726773094.67819: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11630 1726773094.67891: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776/AnsiballZ_command.py 11630 1726773094.68585: Sending initial data 11630 1726773094.68599: Sent initial data (153 bytes) 11630 1726773094.71293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpe7npad54 /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776/AnsiballZ_command.py <<< 11630 1726773094.72639: stderr chunk (state=3): >>><<< 11630 1726773094.72647: stdout chunk (state=3): >>><<< 11630 1726773094.72679: done transferring module to remote 11630 1726773094.72699: _low_level_execute_command(): starting 11630 1726773094.72705: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776/ /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776/AnsiballZ_command.py && sleep 0' 11630 1726773094.75562: stderr chunk (state=2): >>><<< 11630 1726773094.75578: stdout chunk (state=2): >>><<< 11630 1726773094.75607: _low_level_execute_command() done: rc=0, stdout=, stderr= 11630 1726773094.75615: _low_level_execute_command(): starting 11630 1726773094.75624: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776/AnsiballZ_command.py && sleep 0' 11630 1726773096.04253: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:34.906815", "end": "2024-09-19 15:11:36.040403", "delta": "0:00:01.133588", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11630 1726773096.05427: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11630 1726773096.05472: stderr chunk (state=3): >>><<< 11630 1726773096.05478: stdout chunk (state=3): >>><<< 11630 1726773096.05505: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:34.906815", "end": "2024-09-19 15:11:36.040403", "delta": "0:00:01.133588", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11630 1726773096.05540: done with _execute_module (command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11630 1726773096.05552: _low_level_execute_command(): starting 11630 1726773096.05557: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773094.641981-11630-79754030346776/ > /dev/null 2>&1 && sleep 0' 11630 1726773096.08254: stderr chunk (state=2): >>><<< 11630 1726773096.08266: stdout chunk (state=2): >>><<< 11630 1726773096.08287: _low_level_execute_command() done: rc=0, stdout=, stderr= 11630 1726773096.08294: handler run complete 11630 1726773096.08305: attempt loop complete, returning result 11630 1726773096.08321: _execute() done 11630 1726773096.08323: dumping result to json 11630 1726773096.08327: done dumping result, returning 11630 1726773096.08343: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [12a3200b-1e9d-1dbd-cc52-0000000009b4] 11630 1726773096.08358: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b4 11630 1726773096.08396: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b4 11630 1726773096.08441: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.133588", "end": "2024-09-19 15:11:36.040403", "rc": 0, "start": "2024-09-19 15:11:34.906815" } 8119 1726773096.08642: no more pending results, returning what we have 8119 1726773096.08647: results queue empty 8119 1726773096.08649: checking for any_errors_fatal 8119 1726773096.08654: done checking for any_errors_fatal 8119 1726773096.08656: checking for max_fail_percentage 8119 1726773096.08659: done checking for max_fail_percentage 8119 1726773096.08661: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.08663: done checking to see if all hosts have failed 8119 1726773096.08665: getting the remaining hosts for this loop 8119 1726773096.08668: done getting the remaining hosts for this loop 8119 1726773096.08675: building list of next tasks for hosts 8119 1726773096.08677: 
getting the next task for host managed_node2 8119 1726773096.08686: done getting next task for host managed_node2 8119 1726773096.08691: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773096.08695: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.08697: done building task lists 8119 1726773096.08699: counting tasks in each state of execution 8119 1726773096.08702: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.08703: advancing hosts in ITERATING_TASKS 8119 1726773096.08705: starting to advance hosts 8119 1726773096.08706: getting the next task for host managed_node2 8119 1726773096.08711: done getting next task for host managed_node2 8119 1726773096.08713: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773096.08716: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.08717: done advancing hosts to next task 8119 1726773096.08729: getting variables 8119 1726773096.08731: in VariableManager get_vars() 8119 1726773096.08758: Calling all_inventory to load vars for managed_node2 8119 1726773096.08761: Calling groups_inventory to load vars for managed_node2 8119 1726773096.08763: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.08787: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.08802: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.08817: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.08826: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.08837: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.08843: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.08852: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.08870: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.08885: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.09122: done with get_vars() 8119 1726773096.09135: done getting variables 8119 1726773096.09141: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.09142: done copying, going to template now 8119 1726773096.09144: done templating 8119 1726773096.09146: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:36 -0400 (0:00:01.530) 0:01:30.648 **** 8119 1726773096.09161: sending task start callback 8119 1726773096.09163: entering _queue_task() for managed_node2/include_tasks 8119 1726773096.09297: worker is 1 (out of 1 available) 8119 1726773096.09338: exiting _queue_task() for managed_node2/include_tasks 8119 1726773096.09414: done queuing things up, now waiting for results queue to drain 8119 1726773096.09420: waiting for pending results... 
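The "Tuned apply settings" result above comes from the command module invoking tuned-adm with the merged profile list ('virtual-guest kernel_settings'). A minimal sketch of an equivalent task is below; the profile variable is an assumed placeholder standing in for however the role assembles that string.

    - name: Tuned apply settings
      command: "tuned-adm profile '{{ tuned_profile_list }}'"   # assumed variable; the log shows 'virtual-guest kernel_settings'
      # rc=0 with empty stdout/stderr is still reported as changed=true, which matches the callback output above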
11688 1726773096.09471: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 11688 1726773096.09530: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b5 11688 1726773096.09575: calling self._execute() 11688 1726773096.11718: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11688 1726773096.11837: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11688 1726773096.11906: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11688 1726773096.11942: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11688 1726773096.11969: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11688 1726773096.11999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11688 1726773096.12054: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11688 1726773096.12084: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11688 1726773096.12102: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11688 1726773096.12197: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11688 1726773096.12215: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11688 1726773096.12230: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11688 1726773096.12489: _execute() done 11688 1726773096.12494: dumping result to json 11688 1726773096.12497: done dumping result, returning 11688 1726773096.12501: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [12a3200b-1e9d-1dbd-cc52-0000000009b5] 11688 1726773096.12513: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b5 11688 1726773096.12541: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b5 11688 1726773096.12544: WORKER PROCESS EXITING 8119 1726773096.12696: no more pending results, returning what we have 8119 1726773096.12706: in VariableManager get_vars() 8119 1726773096.12754: Calling all_inventory to load vars for managed_node2 8119 1726773096.12761: Calling groups_inventory to load vars for managed_node2 8119 1726773096.12765: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.12831: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.12845: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.12856: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.12865: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.12875: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.12881: Calling groups_plugins_play to load vars for managed_node2 
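The "Verify settings" step above is an include_tasks action that pulls verify_settings.yml into the play at runtime (the log shows the included file being loaded for managed_node2). A minimal equivalent fragment:

    - name: Verify settings
      include_tasks: verify_settings.yml    # included dynamically, so its tasks appear below as new blocks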
8119 1726773096.12898: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.12922: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.12937: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.13149: done with get_vars() 8119 1726773096.13191: we have included files to process 8119 1726773096.13194: generating all_blocks data 8119 1726773096.13197: done generating all_blocks data 8119 1726773096.13201: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773096.13203: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773096.13205: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8119 1726773096.13427: done processing included file 8119 1726773096.13429: iterating over new_blocks loaded from include file 8119 1726773096.13431: in VariableManager get_vars() 8119 1726773096.13452: done with get_vars() 8119 1726773096.13454: filtering new block on tags 8119 1726773096.13515: done filtering new block on tags 8119 1726773096.13526: done iterating over new_blocks loaded from include file 8119 1726773096.13528: extending task lists for all hosts with included blocks 8119 1726773096.13904: done extending task lists 8119 1726773096.13910: done processing included files 8119 1726773096.13911: results queue empty 8119 1726773096.13913: checking for any_errors_fatal 8119 1726773096.13923: done checking for any_errors_fatal 8119 1726773096.13926: checking for max_fail_percentage 8119 1726773096.13928: done checking for max_fail_percentage 8119 1726773096.13930: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.13933: done checking to see if all hosts have failed 8119 1726773096.13935: getting the remaining hosts for this loop 8119 1726773096.13939: done getting the remaining hosts for this loop 8119 1726773096.13944: building list of next tasks for hosts 8119 1726773096.13946: getting the next task for host managed_node2 8119 1726773096.13951: done getting next task for host managed_node2 8119 1726773096.13953: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773096.13956: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.13958: done building task lists 8119 1726773096.13959: counting tasks in each state of execution 8119 1726773096.13962: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.13963: advancing hosts in ITERATING_TASKS 8119 1726773096.13964: starting to advance hosts 8119 1726773096.13966: getting the next task for host managed_node2 8119 1726773096.13969: done getting next task for host managed_node2 8119 1726773096.13971: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773096.13974: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.13975: done advancing hosts to next task 8119 1726773096.13981: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.13987: getting variables 8119 1726773096.13992: in VariableManager get_vars() 8119 1726773096.14013: Calling all_inventory to load vars for managed_node2 8119 1726773096.14017: Calling groups_inventory to load vars for managed_node2 8119 1726773096.14019: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.14034: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.14043: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.14065: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.14077: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.14090: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.14098: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.14113: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.14140: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.14159: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 
8119 1726773096.14355: done with get_vars() 8119 1726773096.14365: done getting variables 8119 1726773096.14370: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.14372: done copying, going to template now 8119 1726773096.14374: done templating 8119 1726773096.14376: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.052) 0:01:30.700 **** 8119 1726773096.14400: sending task start callback 8119 1726773096.14403: entering _queue_task() for managed_node2/command 8119 1726773096.14585: worker is 1 (out of 1 available) 8119 1726773096.14624: exiting _queue_task() for managed_node2/command 8119 1726773096.14696: done queuing things up, now waiting for results queue to drain 8119 1726773096.14701: waiting for pending results... 11692 1726773096.14924: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 11692 1726773096.14998: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000be3 11692 1726773096.15056: calling self._execute() 11692 1726773096.15302: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11692 1726773096.15363: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11692 1726773096.15378: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11692 1726773096.15396: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11692 1726773096.15406: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11692 1726773096.15581: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11692 1726773096.15611: starting attempt loop 11692 1726773096.15614: running the handler 11692 1726773096.15633: _low_level_execute_command(): starting 11692 1726773096.15640: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11692 1726773096.18182: stdout chunk (state=2): >>>/root <<< 11692 1726773096.18320: stderr chunk (state=3): >>><<< 11692 1726773096.18327: stdout chunk (state=3): >>><<< 11692 1726773096.18356: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11692 1726773096.18379: _low_level_execute_command(): starting 11692 1726773096.18392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863 `" && echo ansible-tmp-1726773096.1836977-11692-33509580450863="` echo /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863 `" ) && sleep 0' 11692 1726773096.21600: stdout chunk (state=2): >>>ansible-tmp-1726773096.1836977-11692-33509580450863=/root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863 <<< 11692 1726773096.21652: stderr chunk (state=3): >>><<< 11692 1726773096.21659: stdout chunk (state=3): >>><<< 
11692 1726773096.21679: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773096.1836977-11692-33509580450863=/root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863 , stderr= 11692 1726773096.21835: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11692 1726773096.21903: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863/AnsiballZ_command.py 11692 1726773096.22624: Sending initial data 11692 1726773096.22638: Sent initial data (154 bytes) 11692 1726773096.25030: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpedseqfp0 /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863/AnsiballZ_command.py <<< 11692 1726773096.26206: stderr chunk (state=3): >>><<< 11692 1726773096.26214: stdout chunk (state=3): >>><<< 11692 1726773096.26240: done transferring module to remote 11692 1726773096.26254: _low_level_execute_command(): starting 11692 1726773096.26258: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863/ /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863/AnsiballZ_command.py && sleep 0' 11692 1726773096.28780: stderr chunk (state=2): >>><<< 11692 1726773096.28792: stdout chunk (state=2): >>><<< 11692 1726773096.28814: _low_level_execute_command() done: rc=0, stdout=, stderr= 11692 1726773096.28820: _low_level_execute_command(): starting 11692 1726773096.28827: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863/AnsiballZ_command.py && sleep 0' 11692 1726773096.53685: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:36.430222", "end": "2024-09-19 15:11:36.534763", "delta": "0:00:00.104541", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11692 1726773096.54902: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11692 1726773096.54915: stdout chunk (state=3): >>><<< 11692 1726773096.54928: stderr chunk (state=3): >>><<< 11692 1726773096.54950: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:36.430222", "end": "2024-09-19 15:11:36.534763", "delta": "0:00:00.104541", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
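The check above runs 'tuned-adm verify -i' through the command module. The raw module output reports changed=true, but the final callback below prints "ok" with changed=false, which suggests the task overrides the change status. A sketch under that assumption; the register name is illustrative only.

    - name: Check that settings are applied correctly
      command: tuned-adm verify -i
      register: kernel_settings_verify     # assumed register name
      changed_when: false                  # assumption inferred from changed=false in the callback output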
11692 1726773096.55001: done with _execute_module (command, {'_raw_params': 'tuned-adm verify -i', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11692 1726773096.55019: _low_level_execute_command(): starting 11692 1726773096.55026: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773096.1836977-11692-33509580450863/ > /dev/null 2>&1 && sleep 0' 11692 1726773096.57830: stderr chunk (state=2): >>><<< 11692 1726773096.57843: stdout chunk (state=2): >>><<< 11692 1726773096.57863: _low_level_execute_command() done: rc=0, stdout=, stderr= 11692 1726773096.57870: handler run complete 11692 1726773096.57927: attempt loop complete, returning result 11692 1726773096.57947: _execute() done 11692 1726773096.57950: dumping result to json 11692 1726773096.57955: done dumping result, returning 11692 1726773096.57971: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [12a3200b-1e9d-1dbd-cc52-000000000be3] 11692 1726773096.57988: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000be3 11692 1726773096.58057: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000be3 11692 1726773096.58062: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.104541", "end": "2024-09-19 15:11:36.534763", "rc": 0, "start": "2024-09-19 15:11:36.430222" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8119 1726773096.58547: no more pending results, returning what we have 8119 1726773096.58552: results queue empty 8119 1726773096.58553: checking for any_errors_fatal 8119 1726773096.58556: done checking for any_errors_fatal 8119 1726773096.58557: checking for max_fail_percentage 8119 1726773096.58559: done checking for max_fail_percentage 8119 1726773096.58560: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.58562: done checking to see if all hosts have failed 8119 1726773096.58563: getting the remaining hosts for this loop 8119 1726773096.58565: done getting the remaining hosts for this loop 8119 1726773096.58576: building list of next tasks for hosts 8119 1726773096.58580: getting the next task for host managed_node2 8119 1726773096.58591: done getting next task for host managed_node2 8119 1726773096.58595: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773096.58599: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.58601: done building task lists 8119 1726773096.58602: counting tasks in each state of execution 8119 1726773096.58605: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.58606: advancing hosts in ITERATING_TASKS 8119 1726773096.58608: starting to advance hosts 8119 1726773096.58610: getting the next task for host managed_node2 8119 1726773096.58613: done getting next task for host managed_node2 8119 1726773096.58614: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773096.58617: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.58619: done advancing hosts to next task 8119 1726773096.58630: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.58634: getting variables 8119 1726773096.58636: in VariableManager get_vars() 8119 1726773096.58673: Calling all_inventory to load vars for managed_node2 8119 1726773096.58678: Calling groups_inventory to load vars for managed_node2 8119 1726773096.58680: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.58708: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.58721: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.58732: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.58741: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.58751: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.58760: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.58772: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.58794: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.58809: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.59069: done with get_vars() 8119 1726773096.59080: done getting variables 8119 1726773096.59087: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.59090: done copying, going to template now 8119 1726773096.59092: done templating 8119 1726773096.59093: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.447) 0:01:31.147 **** 8119 1726773096.59110: sending task start callback 8119 1726773096.59112: entering _queue_task() for managed_node2/shell 8119 1726773096.59250: worker is 1 (out of 1 available) 8119 1726773096.59291: exiting _queue_task() for managed_node2/shell 8119 1726773096.59363: done queuing things up, now waiting for results queue to drain 8119 1726773096.59368: waiting for pending results... 
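The "Get last verify results from log" task is dispatched through the shell action plugin and, in this run, is skipped because its conditional is false (the verification above succeeded). The actual command is not visible in this log; the grep below is purely illustrative of the kind of log scrape such a task performs, and every name in it is an assumption.

    - name: Get last verify results from log
      shell: grep -i verif /var/log/tuned/tuned.log | tail -n 50   # illustrative only; real command not shown in this log
      register: kernel_settings_verify_log                          # assumed name
      when: kernel_settings_verify is failed                        # assumed condition; false here, hence the skip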
11713 1726773096.59450: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 11713 1726773096.59535: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000be4 11713 1726773096.59593: calling self._execute() 11713 1726773096.61665: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11713 1726773096.61757: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11713 1726773096.61816: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11713 1726773096.61850: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11713 1726773096.61878: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11713 1726773096.61922: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11713 1726773096.61973: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11713 1726773096.61999: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11713 1726773096.62019: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11713 1726773096.62105: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11713 1726773096.62125: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11713 1726773096.62139: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11713 1726773096.62422: when evaluation is False, skipping this task 11713 1726773096.62428: _execute() done 11713 1726773096.62430: dumping result to json 11713 1726773096.62433: done dumping result, returning 11713 1726773096.62438: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [12a3200b-1e9d-1dbd-cc52-000000000be4] 11713 1726773096.62450: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000be4 11713 1726773096.62479: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000be4 11713 1726773096.62484: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773096.62901: no more pending results, returning what we have 8119 1726773096.62907: results queue empty 8119 1726773096.62909: checking for any_errors_fatal 8119 1726773096.62917: done checking for any_errors_fatal 8119 1726773096.62920: checking for max_fail_percentage 8119 1726773096.62923: done checking for max_fail_percentage 8119 1726773096.62925: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.62928: done checking to see if all hosts have failed 8119 1726773096.62930: getting the remaining hosts for this loop 8119 1726773096.62933: done getting the remaining hosts for this loop 8119 1726773096.62942: building list of next tasks for hosts 8119 1726773096.62946: getting the next task for host managed_node2 8119 1726773096.62955: done getting next task for host managed_node2 8119 1726773096.62962: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader 
errors 8119 1726773096.62968: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.62971: done building task lists 8119 1726773096.62974: counting tasks in each state of execution 8119 1726773096.62978: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.62981: advancing hosts in ITERATING_TASKS 8119 1726773096.62985: starting to advance hosts 8119 1726773096.62988: getting the next task for host managed_node2 8119 1726773096.62993: done getting next task for host managed_node2 8119 1726773096.62996: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8119 1726773096.63001: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.63003: done advancing hosts to next task 8119 1726773096.63019: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.63025: getting variables 8119 1726773096.63029: in VariableManager get_vars() 8119 1726773096.63067: Calling all_inventory to load vars for managed_node2 8119 1726773096.63073: Calling groups_inventory to load vars for managed_node2 8119 1726773096.63077: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.63109: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.63126: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.63143: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.63158: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.63176: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.63188: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.63204: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.63233: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.63254: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.63568: done with get_vars() 8119 1726773096.63583: done getting variables 8119 1726773096.63590: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.63592: done copying, going to template now 8119 1726773096.63594: done templating 8119 1726773096.63596: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.045) 0:01:31.192 **** 8119 1726773096.63614: sending task start callback 8119 1726773096.63616: entering _queue_task() for managed_node2/fail 8119 1726773096.63757: worker is 1 (out of 1 available) 8119 1726773096.63798: exiting _queue_task() for managed_node2/fail 8119 1726773096.63869: done queuing things up, now waiting for results queue to drain 8119 1726773096.63874: waiting for pending results... 
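
The two verification tasks referenced in the records above and below, "Get last verify results from log" (verify_settings.yml:12) and "Report errors that are not bootloader errors" (verify_settings.yml:23), are both skipped because their when conditions evaluate to false ("Conditional result was False"). The log does not show the actual conditions or command, so the following is only an illustrative sketch of how such guarded tasks are commonly written; the task names and file come from the log, everything else is an assumption.

    - name: Get last verify results from log
      shell: cat /var/log/tuned/tuned.log        # placeholder command; the real one is not shown in this log
      register: __kernel_settings_verify_output  # hypothetical variable name
      changed_when: false
      when: false                                # stands in for the real condition, which evaluated to false this run

    - name: Report errors that are not bootloader errors
      fail:
        msg: "{{ __kernel_settings_verify_output.stdout }}"  # hypothetical message
      when: false                                            # the real condition also evaluated to false this run
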
11717 1726773096.63942: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 11717 1726773096.64009: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000be5 11717 1726773096.64053: calling self._execute() 11717 1726773096.65872: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11717 1726773096.65963: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11717 1726773096.66023: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11717 1726773096.66051: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11717 1726773096.66093: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11717 1726773096.66127: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11717 1726773096.66170: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11717 1726773096.66197: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11717 1726773096.66217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11717 1726773096.66299: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11717 1726773096.66321: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11717 1726773096.66335: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11717 1726773096.66603: when evaluation is False, skipping this task 11717 1726773096.66610: _execute() done 11717 1726773096.66612: dumping result to json 11717 1726773096.66613: done dumping result, returning 11717 1726773096.66618: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [12a3200b-1e9d-1dbd-cc52-000000000be5] 11717 1726773096.66627: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000be5 11717 1726773096.66657: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000be5 11717 1726773096.66660: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773096.66887: no more pending results, returning what we have 8119 1726773096.66892: results queue empty 8119 1726773096.66895: checking for any_errors_fatal 8119 1726773096.66899: done checking for any_errors_fatal 8119 1726773096.66901: checking for max_fail_percentage 8119 1726773096.66904: done checking for max_fail_percentage 8119 1726773096.66906: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.66908: done checking to see if all hosts have failed 8119 1726773096.66910: getting the remaining hosts for this loop 8119 1726773096.66913: done getting the remaining hosts for this loop 8119 1726773096.66920: building list of next tasks for hosts 8119 1726773096.66923: getting the next task for host managed_node2 8119 1726773096.66932: done getting next task for host managed_node2 8119 1726773096.66938: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag 
that reboot is needed to apply changes 8119 1726773096.66942: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.66944: done building task lists 8119 1726773096.66945: counting tasks in each state of execution 8119 1726773096.66948: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.66949: advancing hosts in ITERATING_TASKS 8119 1726773096.66951: starting to advance hosts 8119 1726773096.66952: getting the next task for host managed_node2 8119 1726773096.66955: done getting next task for host managed_node2 8119 1726773096.66957: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8119 1726773096.66959: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.66960: done advancing hosts to next task 8119 1726773096.66972: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.66975: getting variables 8119 1726773096.66978: in VariableManager get_vars() 8119 1726773096.67010: Calling all_inventory to load vars for managed_node2 8119 1726773096.67015: Calling groups_inventory to load vars for managed_node2 8119 1726773096.67017: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.67040: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.67050: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.67060: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.67069: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.67079: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.67087: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.67098: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.67123: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.67138: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.67368: done with get_vars() 8119 1726773096.67378: done getting variables 8119 1726773096.67386: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.67389: done copying, going to template now 8119 1726773096.67391: done templating 8119 1726773096.67392: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.037) 0:01:31.230 **** 8119 1726773096.67409: sending task start callback 8119 1726773096.67411: entering _queue_task() for managed_node2/set_fact 8119 1726773096.67553: worker is 1 (out of 1 available) 8119 1726773096.67593: exiting _queue_task() for managed_node2/set_fact 8119 1726773096.67668: done queuing things up, now waiting for results queue to drain 8119 1726773096.67674: waiting for pending results... 11719 1726773096.67738: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 11719 1726773096.67799: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b6 11719 1726773096.67849: calling self._execute() 11719 1726773096.68014: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11719 1726773096.68060: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11719 1726773096.68073: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11719 1726773096.68086: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11719 1726773096.68094: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11719 1726773096.68227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11719 1726773096.68249: starting attempt loop 11719 1726773096.68252: running the handler 11719 1726773096.68276: handler run complete 11719 1726773096.68281: attempt loop complete, returning result 11719 1726773096.68287: _execute() done 11719 1726773096.68290: dumping result to json 11719 1726773096.68293: done dumping result, returning 11719 1726773096.68299: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-0000000009b6] 11719 1726773096.68307: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b6 11719 1726773096.68340: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b6 11719 1726773096.68344: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8119 1726773096.68507: no more 
pending results, returning what we have 8119 1726773096.68512: results queue empty 8119 1726773096.68516: checking for any_errors_fatal 8119 1726773096.68522: done checking for any_errors_fatal 8119 1726773096.68524: checking for max_fail_percentage 8119 1726773096.68527: done checking for max_fail_percentage 8119 1726773096.68529: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.68531: done checking to see if all hosts have failed 8119 1726773096.68533: getting the remaining hosts for this loop 8119 1726773096.68535: done getting the remaining hosts for this loop 8119 1726773096.68543: building list of next tasks for hosts 8119 1726773096.68545: getting the next task for host managed_node2 8119 1726773096.68552: done getting next task for host managed_node2 8119 1726773096.68556: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773096.68561: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.68563: done building task lists 8119 1726773096.68565: counting tasks in each state of execution 8119 1726773096.68569: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.68571: advancing hosts in ITERATING_TASKS 8119 1726773096.68573: starting to advance hosts 8119 1726773096.68575: getting the next task for host managed_node2 8119 1726773096.68578: done getting next task for host managed_node2 8119 1726773096.68581: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773096.68586: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.68589: done advancing hosts to next task 8119 1726773096.68600: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.68603: getting variables 8119 1726773096.68605: in VariableManager get_vars() 8119 1726773096.68631: Calling all_inventory to load vars for managed_node2 8119 1726773096.68635: Calling groups_inventory to load vars for managed_node2 8119 1726773096.68637: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.68657: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.68667: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.68677: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.68688: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.68700: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.68706: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.68717: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.68736: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.68749: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.68956: done with get_vars() 8119 1726773096.68967: done getting variables 8119 1726773096.68972: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.68974: done copying, going to template now 8119 1726773096.68976: done templating 8119 1726773096.68977: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.015) 0:01:31.246 **** 8119 1726773096.68995: sending task start callback 8119 1726773096.68997: entering _queue_task() for managed_node2/set_fact 8119 1726773096.69119: worker is 1 (out of 1 available) 8119 1726773096.69155: exiting _queue_task() for managed_node2/set_fact 8119 1726773096.69226: done queuing things up, now waiting for results queue to drain 8119 1726773096.69232: waiting for pending results... 
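
The two set_fact tasks at main.yml:177 and main.yml:181 simply record role state for the later test assertions. The fact names and values below are taken from the "ok:" results printed in this log (kernel_settings_reboot_required is false in the result just above; __kernel_settings_changed is reported as true a few records further down); the YAML layout itself is a minimal sketch, not copied from the role.

    - name: Set the flag that reboot is needed to apply changes
      set_fact:
        kernel_settings_reboot_required: false   # value shown in the task result above; the role presumably computes this

    - name: Set flag to indicate changed for testing
      set_fact:
        __kernel_settings_changed: true          # value shown in the task result below; presumably derived from the apply step
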
11721 1726773096.69296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 11721 1726773096.69347: in run() - task 12a3200b-1e9d-1dbd-cc52-0000000009b7 11721 1726773096.69392: calling self._execute() 11721 1726773096.71229: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11721 1726773096.71318: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11721 1726773096.71456: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11721 1726773096.71493: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11721 1726773096.71524: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11721 1726773096.71552: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11721 1726773096.71599: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11721 1726773096.71626: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11721 1726773096.71642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11721 1726773096.71726: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11721 1726773096.71744: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11721 1726773096.71758: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11721 1726773096.72000: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11721 1726773096.72004: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11721 1726773096.72007: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11721 1726773096.72011: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11721 1726773096.72013: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11721 1726773096.72015: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11721 1726773096.72016: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11721 1726773096.72018: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11721 1726773096.72020: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11721 1726773096.72035: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11721 1726773096.72039: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11721 1726773096.72042: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11721 1726773096.72089: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11721 1726773096.72124: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11721 1726773096.72135: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11721 1726773096.72145: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11721 1726773096.72149: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11721 1726773096.72248: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11721 1726773096.72256: starting attempt loop 11721 1726773096.72258: running the handler 11721 1726773096.72268: handler run complete 11721 1726773096.72273: attempt loop complete, returning result 11721 1726773096.72275: _execute() done 11721 1726773096.72277: dumping result to json 11721 1726773096.72279: done dumping result, returning 11721 1726773096.72285: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [12a3200b-1e9d-1dbd-cc52-0000000009b7] 11721 1726773096.72293: sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b7 11721 1726773096.72325: done sending task result for task 12a3200b-1e9d-1dbd-cc52-0000000009b7 11721 1726773096.72329: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8119 1726773096.72478: no more pending results, returning what we have 8119 1726773096.72484: results queue empty 8119 1726773096.72487: checking for any_errors_fatal 8119 1726773096.72491: done checking for any_errors_fatal 8119 1726773096.72493: checking for max_fail_percentage 8119 1726773096.72496: done checking for max_fail_percentage 8119 1726773096.72498: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.72500: done checking to see if all hosts have failed 8119 1726773096.72502: getting the remaining hosts for this loop 8119 1726773096.72504: done getting the remaining hosts for this loop 8119 1726773096.72512: building list of next tasks for hosts 8119 1726773096.72514: getting the next task for host managed_node2 8119 1726773096.72523: done getting next task for host managed_node2 8119 1726773096.72526: ^ task is: TASK: Force handlers 8119 1726773096.72529: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.72531: done building task lists 8119 1726773096.72533: counting tasks in each state of execution 8119 1726773096.72537: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.72539: advancing hosts in ITERATING_TASKS 8119 1726773096.72541: starting to advance hosts 8119 1726773096.72543: getting the next task for host managed_node2 8119 1726773096.72547: done getting next task for host managed_node2 8119 1726773096.72550: ^ task is: TASK: Force handlers 8119 1726773096.72552: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.72554: done advancing hosts to next task META: ran handlers 8119 1726773096.72580: done queuing things up, now waiting for results queue to drain 8119 1726773096.72584: results queue empty 8119 1726773096.72587: checking for any_errors_fatal 8119 1726773096.72590: done checking for any_errors_fatal 8119 1726773096.72592: checking for max_fail_percentage 8119 1726773096.72594: done checking for max_fail_percentage 8119 1726773096.72596: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.72598: done checking to see if all hosts have failed 8119 1726773096.72600: getting the remaining hosts for this loop 8119 1726773096.72602: done getting the remaining hosts for this loop 8119 1726773096.72608: building list of next tasks for hosts 8119 1726773096.72611: getting the next task for host managed_node2 8119 1726773096.72614: done getting next task for host managed_node2 8119 1726773096.72616: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8119 1726773096.72619: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.72621: done building task lists 8119 1726773096.72623: counting tasks in each state of execution 8119 1726773096.72625: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.72627: advancing hosts in ITERATING_TASKS 8119 1726773096.72629: starting to advance hosts 8119 1726773096.72631: getting the next task for host managed_node2 8119 1726773096.72634: done getting next task for host managed_node2 8119 1726773096.72636: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8119 1726773096.72638: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.72640: done advancing hosts to next task 8119 1726773096.72648: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.72650: getting variables 8119 1726773096.72652: in VariableManager get_vars() 8119 1726773096.72676: Calling all_inventory to load vars for managed_node2 8119 1726773096.72679: Calling groups_inventory to load vars for managed_node2 8119 1726773096.72681: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.72707: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.72721: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.72732: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.72740: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.72751: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.72757: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.72768: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.72789: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.72807: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.73026: done with get_vars() 8119 1726773096.73038: done getting variables 8119 1726773096.73043: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.73045: done copying, going to template now 8119 1726773096.73047: done templating 8119 1726773096.73048: here goes the callback... TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:194 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.040) 0:01:31.287 **** 8119 1726773096.73063: sending task start callback 8119 1726773096.73065: entering _queue_task() for managed_node2/assert 8119 1726773096.73201: worker is 1 (out of 1 available) 8119 1726773096.73238: exiting _queue_task() for managed_node2/assert 8119 1726773096.73311: done queuing things up, now waiting for results queue to drain 8119 1726773096.73317: waiting for pending results... 
11723 1726773096.73377: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 11723 1726773096.73426: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000002b 11723 1726773096.73473: calling self._execute() 11723 1726773096.73633: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11723 1726773096.73672: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11723 1726773096.73687: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11723 1726773096.73700: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11723 1726773096.73707: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11723 1726773096.73836: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11723 1726773096.73861: starting attempt loop 11723 1726773096.73864: running the handler 11723 1726773096.75557: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11723 1726773096.75665: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11723 1726773096.75721: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11723 1726773096.75752: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11723 1726773096.75778: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11723 1726773096.75813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11723 1726773096.75860: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11723 1726773096.75886: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11723 1726773096.75903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11723 1726773096.75988: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11723 1726773096.76006: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11723 1726773096.76034: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11723 1726773096.76314: handler run complete 11723 1726773096.76320: attempt loop complete, returning result 11723 1726773096.76323: _execute() done 11723 1726773096.76324: dumping result to json 11723 1726773096.76326: done dumping result, returning 11723 1726773096.76330: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [12a3200b-1e9d-1dbd-cc52-00000000002b] 11723 1726773096.76339: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002b 11723 1726773096.76367: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002b 11723 
1726773096.76371: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8119 1726773096.76539: no more pending results, returning what we have 8119 1726773096.76543: results queue empty 8119 1726773096.76546: checking for any_errors_fatal 8119 1726773096.76549: done checking for any_errors_fatal 8119 1726773096.76551: checking for max_fail_percentage 8119 1726773096.76554: done checking for max_fail_percentage 8119 1726773096.76556: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.76557: done checking to see if all hosts have failed 8119 1726773096.76559: getting the remaining hosts for this loop 8119 1726773096.76562: done getting the remaining hosts for this loop 8119 1726773096.76569: building list of next tasks for hosts 8119 1726773096.76572: getting the next task for host managed_node2 8119 1726773096.76578: done getting next task for host managed_node2 8119 1726773096.76581: ^ task is: TASK: Ensure role reported changed 8119 1726773096.76587: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.76589: done building task lists 8119 1726773096.76591: counting tasks in each state of execution 8119 1726773096.76595: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.76597: advancing hosts in ITERATING_TASKS 8119 1726773096.76599: starting to advance hosts 8119 1726773096.76601: getting the next task for host managed_node2 8119 1726773096.76604: done getting next task for host managed_node2 8119 1726773096.76607: ^ task is: TASK: Ensure role reported changed 8119 1726773096.76609: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.76611: done advancing hosts to next task 8119 1726773096.76625: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.76629: getting variables 8119 1726773096.76632: in VariableManager get_vars() 8119 1726773096.76665: Calling all_inventory to load vars for managed_node2 8119 1726773096.76671: Calling groups_inventory to load vars for managed_node2 8119 1726773096.76673: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.76697: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.76708: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.76721: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.76730: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.76740: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.76746: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.76755: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.76772: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.76790: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.76997: done with get_vars() 8119 1726773096.77007: done getting variables 8119 1726773096.77012: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.77014: done copying, going to template now 8119 1726773096.77017: done templating 8119 1726773096.77019: here goes the callback... TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:198 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.039) 0:01:31.326 **** 8119 1726773096.77035: sending task start callback 8119 1726773096.77037: entering _queue_task() for managed_node2/assert 8119 1726773096.77165: worker is 1 (out of 1 available) 8119 1726773096.77205: exiting _queue_task() for managed_node2/assert 8119 1726773096.77277: done queuing things up, now waiting for results queue to drain 8119 1726773096.77282: waiting for pending results... 
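
Both test assertions, "Ensure kernel_settings_reboot_required is not set or is false" (tests_change_settings.yml:194) and "Ensure role reported changed" (tests_change_settings.yml:198), pass, as the surrounding "All assertions passed" results show. A minimal sketch of what these assert tasks might look like follows; the that: expressions are inferred from the task names and from the facts set earlier in this run, not copied from the test playbook.

    - name: Ensure kernel_settings_reboot_required is not set or is false
      assert:
        that:
          - kernel_settings_reboot_required is not defined or not kernel_settings_reboot_required

    - name: Ensure role reported changed
      assert:
        that:
          - __kernel_settings_changed | bool   # assumed expression; mirrors the flag set at main.yml:181
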
11725 1726773096.77343: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 11725 1726773096.77385: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000002c 11725 1726773096.77436: calling self._execute() 11725 1726773096.77590: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11725 1726773096.77635: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11725 1726773096.77651: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11725 1726773096.77663: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11725 1726773096.77671: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11725 1726773096.77802: Loading ActionModule 'assert' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11725 1726773096.77827: starting attempt loop 11725 1726773096.77830: running the handler 11725 1726773096.79766: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11725 1726773096.79875: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11725 1726773096.79946: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11725 1726773096.79988: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11725 1726773096.80033: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11725 1726773096.80089: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11725 1726773096.80152: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11725 1726773096.80184: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11725 1726773096.80210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11725 1726773096.80325: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11725 1726773096.80349: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11725 1726773096.80372: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11725 1726773096.80743: handler run complete 11725 1726773096.80751: attempt loop complete, returning result 11725 1726773096.80755: _execute() done 11725 1726773096.80758: dumping result to json 11725 1726773096.80760: done dumping result, returning 11725 1726773096.80767: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [12a3200b-1e9d-1dbd-cc52-00000000002c] 11725 1726773096.80779: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002c 11725 1726773096.80822: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002c 11725 1726773096.80827: WORKER PROCESS EXITING ok: [managed_node2] => { 
"changed": false } MSG: All assertions passed 8119 1726773096.81018: no more pending results, returning what we have 8119 1726773096.81023: results queue empty 8119 1726773096.81025: checking for any_errors_fatal 8119 1726773096.81029: done checking for any_errors_fatal 8119 1726773096.81031: checking for max_fail_percentage 8119 1726773096.81034: done checking for max_fail_percentage 8119 1726773096.81036: checking to see if all hosts have failed and the running result is not ok 8119 1726773096.81038: done checking to see if all hosts have failed 8119 1726773096.81040: getting the remaining hosts for this loop 8119 1726773096.81043: done getting the remaining hosts for this loop 8119 1726773096.81056: building list of next tasks for hosts 8119 1726773096.81060: getting the next task for host managed_node2 8119 1726773096.81067: done getting next task for host managed_node2 8119 1726773096.81070: ^ task is: TASK: Check sysctl 8119 1726773096.81073: ^ state is: HOST STATE: block=2, task=46, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773096.81075: done building task lists 8119 1726773096.81077: counting tasks in each state of execution 8119 1726773096.81081: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773096.81084: advancing hosts in ITERATING_TASKS 8119 1726773096.81087: starting to advance hosts 8119 1726773096.81089: getting the next task for host managed_node2 8119 1726773096.81092: done getting next task for host managed_node2 8119 1726773096.81095: ^ task is: TASK: Check sysctl 8119 1726773096.81097: ^ state is: HOST STATE: block=2, task=46, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773096.81100: done advancing hosts to next task 8119 1726773096.81117: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773096.81121: getting variables 8119 1726773096.81124: in VariableManager get_vars() 8119 1726773096.81148: Calling all_inventory to load vars for managed_node2 8119 1726773096.81152: Calling groups_inventory to load vars for managed_node2 8119 1726773096.81154: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773096.81175: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.81188: Calling all_plugins_play to load vars for managed_node2 8119 1726773096.81199: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.81209: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773096.81224: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.81231: Calling groups_plugins_play to load vars for managed_node2 8119 1726773096.81241: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.81258: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.81272: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773096.81471: done with get_vars() 8119 1726773096.81481: done getting variables 8119 1726773096.81488: sending task start callback, copying the task so we can template it temporarily 8119 1726773096.81490: done copying, going to template now 8119 1726773096.81492: done templating 8119 1726773096.81493: here goes the callback... TASK [Check sysctl] ************************************************************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:202 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.044) 0:01:31.371 **** 8119 1726773096.81508: sending task start callback 8119 1726773096.81510: entering _queue_task() for managed_node2/shell 8119 1726773096.81634: worker is 1 (out of 1 available) 8119 1726773096.81674: exiting _queue_task() for managed_node2/shell 8119 1726773096.81748: done queuing things up, now waiting for results queue to drain 8119 1726773096.81754: waiting for pending results... 
11729 1726773096.81806: running TaskExecutor() for managed_node2/TASK: Check sysctl 11729 1726773096.81848: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000002d 11729 1726773096.81896: calling self._execute() 11729 1726773096.82045: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11729 1726773096.82085: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11729 1726773096.82096: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11729 1726773096.82108: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11729 1726773096.82117: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11729 1726773096.82280: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11729 1726773096.82310: starting attempt loop 11729 1726773096.82315: running the handler 11729 1726773096.82323: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11729 1726773096.82341: _low_level_execute_command(): starting 11729 1726773096.82348: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11729 1726773096.85127: stdout chunk (state=2): >>>/root <<< 11729 1726773096.85242: stderr chunk (state=3): >>><<< 11729 1726773096.85248: stdout chunk (state=3): >>><<< 11729 1726773096.85270: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11729 1726773096.85291: _low_level_execute_command(): starting 11729 1726773096.85299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442 `" && echo ansible-tmp-1726773096.8528109-11729-88458755639442="` echo /root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442 `" ) && sleep 0' 11729 1726773096.88006: stdout chunk (state=2): >>>ansible-tmp-1726773096.8528109-11729-88458755639442=/root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442 <<< 11729 1726773096.88150: stderr chunk (state=3): >>><<< 11729 1726773096.88156: stdout chunk (state=3): >>><<< 11729 1726773096.88175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773096.8528109-11729-88458755639442=/root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442 , stderr= 11729 1726773096.88352: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11729 1726773096.88416: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442/AnsiballZ_command.py 11729 1726773096.88747: Sending initial data 11729 1726773096.88762: Sent initial data (154 bytes) 11729 1726773096.91227: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpeguc7l8a 
/root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442/AnsiballZ_command.py <<< 11729 1726773096.92222: stderr chunk (state=3): >>><<< 11729 1726773096.92228: stdout chunk (state=3): >>><<< 11729 1726773096.92254: done transferring module to remote 11729 1726773096.92268: _low_level_execute_command(): starting 11729 1726773096.92273: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442/ /root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442/AnsiballZ_command.py && sleep 0' 11729 1726773096.94828: stderr chunk (state=2): >>><<< 11729 1726773096.94840: stdout chunk (state=2): >>><<< 11729 1726773096.94860: _low_level_execute_command() done: rc=0, stdout=, stderr= 11729 1726773096.94864: _low_level_execute_command(): starting 11729 1726773096.94874: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442/AnsiballZ_command.py && sleep 0' 11729 1726773097.10051: stdout chunk (state=2): >>> {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:37.093169", "end": "2024-09-19 15:11:37.098684", "delta": "0:00:00.005515", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11729 1726773097.11084: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11729 1726773097.11128: stderr chunk (state=3): >>><<< 11729 1726773097.11136: stdout chunk (state=3): >>><<< 11729 1726773097.11160: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:37.093169", "end": "2024-09-19 15:11:37.098684", "delta": "0:00:00.005515", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
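
The module result above contains the actual command run by the "Check sysctl" task (tests_change_settings.yml:202). A reconstruction of that task follows: the command string is taken verbatim from the module arguments in the log, while the surrounding YAML and the changed_when guard (suggested by the final "changed": false in the task result) are assumptions.

    - name: Check sysctl
      shell: |
        set -euo pipefail
        sysctl -n fs.file-max | grep -Lvxq 400001
      changed_when: false   # assumed; the task result reports "changed": false even though shell defaults to changed

The records above also show Ansible's standard module execution flow for this task: create a remote temp directory, sftp the AnsiballZ_command.py wrapper into it, chmod it, and run it with /usr/libexec/platform-python; the rc=0 in the result means the check passed, and the temp directory is removed in the next record.
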
11729 1726773097.11197: done with _execute_module (command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11729 1726773097.11209: _low_level_execute_command(): starting 11729 1726773097.11215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773096.8528109-11729-88458755639442/ > /dev/null 2>&1 && sleep 0' 11729 1726773097.14296: stderr chunk (state=2): >>><<< 11729 1726773097.14312: stdout chunk (state=2): >>><<< 11729 1726773097.14337: _low_level_execute_command() done: rc=0, stdout=, stderr= 11729 1726773097.14347: handler run complete 11729 1726773097.14358: attempt loop complete, returning result 11729 1726773097.14372: _execute() done 11729 1726773097.14375: dumping result to json 11729 1726773097.14380: done dumping result, returning 11729 1726773097.14396: done running TaskExecutor() for managed_node2/TASK: Check sysctl [12a3200b-1e9d-1dbd-cc52-00000000002d] 11729 1726773097.14414: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002d 11729 1726773097.14471: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002d 11729 1726773097.14476: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "delta": "0:00:00.005515", "end": "2024-09-19 15:11:37.098684", "rc": 0, "start": "2024-09-19 15:11:37.093169" } 8119 1726773097.14842: no more pending results, returning what we have 8119 1726773097.14846: results queue empty 8119 1726773097.14849: checking for any_errors_fatal 8119 1726773097.14853: done checking for any_errors_fatal 8119 1726773097.14855: checking for max_fail_percentage 8119 1726773097.14858: done checking for max_fail_percentage 8119 1726773097.14861: checking to see if all hosts have failed and the running result is not ok 8119 1726773097.14863: done checking to see if all hosts have failed 8119 1726773097.14865: getting the remaining hosts for this loop 8119 1726773097.14868: done getting the remaining hosts for this loop 8119 1726773097.14875: building list of next tasks for hosts 8119 1726773097.14878: getting the next task for host managed_node2 8119 1726773097.14886: done getting next task for host managed_node2 8119 1726773097.14889: ^ task is: TASK: Check sysfs after role runs 8119 1726773097.14893: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773097.14895: done building task lists 8119 1726773097.14896: counting tasks in each state of execution 8119 1726773097.14900: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773097.14902: advancing hosts in ITERATING_TASKS 8119 1726773097.14904: starting to advance hosts 8119 1726773097.14906: getting the next task for host managed_node2 8119 1726773097.14911: done getting next task for host managed_node2 8119 1726773097.14914: ^ task is: TASK: Check sysfs after role runs 8119 1726773097.14916: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773097.14918: done advancing hosts to next task 8119 1726773097.14932: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773097.14935: getting variables 8119 1726773097.14938: in VariableManager get_vars() 8119 1726773097.14969: Calling all_inventory to load vars for managed_node2 8119 1726773097.14974: Calling groups_inventory to load vars for managed_node2 8119 1726773097.14978: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773097.15012: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.15031: Calling all_plugins_play to load vars for managed_node2 8119 1726773097.15050: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.15065: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773097.15085: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.15097: Calling groups_plugins_play to load vars for managed_node2 8119 1726773097.15119: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.15152: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.15176: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.15551: done with get_vars() 8119 1726773097.15565: done getting variables 8119 1726773097.15572: sending task start callback, copying the task so we can template it temporarily 8119 1726773097.15575: done copying, going to template now 8119 1726773097.15578: done templating 8119 1726773097.15580: here goes the callback... 
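For reference, a minimal sketch of what the just-completed "Check sysctl" task could look like in the test playbook. It is reconstructed from the module_args traced above, not copied from the test file; the changed_when handling is an inference from the module reporting changed=true while the callback printed changed=false.

    - name: Check sysctl
      shell: |-
        set -euo pipefail
        sysctl -n fs.file-max | grep -Lvxq 400001
      changed_when: false  # inferred from the changed=false in the callback above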
TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:208 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.340) 0:01:31.712 **** 8119 1726773097.15605: sending task start callback 8119 1726773097.15611: entering _queue_task() for managed_node2/command 8119 1726773097.15791: worker is 1 (out of 1 available) 8119 1726773097.15831: exiting _queue_task() for managed_node2/command 8119 1726773097.15907: done queuing things up, now waiting for results queue to drain 8119 1726773097.15915: waiting for pending results... 11749 1726773097.16192: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 11749 1726773097.16248: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000002e 11749 1726773097.16305: calling self._execute() 11749 1726773097.16511: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11749 1726773097.16573: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11749 1726773097.16592: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11749 1726773097.16613: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11749 1726773097.16626: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11749 1726773097.16802: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11749 1726773097.16832: starting attempt loop 11749 1726773097.16837: running the handler 11749 1726773097.16851: _low_level_execute_command(): starting 11749 1726773097.16858: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11749 1726773097.19798: stdout chunk (state=2): >>>/root <<< 11749 1726773097.19920: stderr chunk (state=3): >>><<< 11749 1726773097.19926: stdout chunk (state=3): >>><<< 11749 1726773097.19947: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11749 1726773097.19962: _low_level_execute_command(): starting 11749 1726773097.19968: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710 `" && echo ansible-tmp-1726773097.1995494-11749-48669983232710="` echo /root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710 `" ) && sleep 0' 11749 1726773097.23021: stdout chunk (state=2): >>>ansible-tmp-1726773097.1995494-11749-48669983232710=/root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710 <<< 11749 1726773097.23157: stderr chunk (state=3): >>><<< 11749 1726773097.23165: stdout chunk (state=3): >>><<< 11749 1726773097.23192: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773097.1995494-11749-48669983232710=/root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710 , stderr= 11749 1726773097.23341: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11749 1726773097.23404: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710/AnsiballZ_command.py 11749 1726773097.24112: Sending initial data 11749 1726773097.24127: Sent initial data (154 bytes) 11749 1726773097.26808: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpwocuipz3 /root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710/AnsiballZ_command.py <<< 11749 1726773097.28223: stderr chunk (state=3): >>><<< 11749 1726773097.28232: stdout chunk (state=3): >>><<< 11749 1726773097.28263: done transferring module to remote 11749 1726773097.28282: _low_level_execute_command(): starting 11749 1726773097.28295: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710/ /root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710/AnsiballZ_command.py && sleep 0' 11749 1726773097.31347: stderr chunk (state=2): >>><<< 11749 1726773097.31360: stdout chunk (state=2): >>><<< 11749 1726773097.31384: _low_level_execute_command() done: rc=0, stdout=, stderr= 11749 1726773097.31389: _low_level_execute_command(): starting 11749 1726773097.31396: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710/AnsiballZ_command.py && sleep 0' 11749 1726773097.46425: stdout chunk (state=2): >>> {"cmd": ["grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:37.459242", "end": "2024-09-19 15:11:37.462342", "delta": "0:00:00.003100", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -Lxqv 60666 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11749 1726773097.47477: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11749 1726773097.47527: stderr chunk (state=3): >>><<< 11749 1726773097.47532: stdout chunk (state=3): >>><<< 11749 1726773097.47559: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:37.459242", "end": "2024-09-19 15:11:37.462342", "delta": "0:00:00.003100", "changed": true, "invocation": {"module_args": {"_raw_params": "grep -Lxqv 60666 /sys/class/net/lo/mtu", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
11749 1726773097.47597: done with _execute_module (command, {'_raw_params': 'grep -Lxqv 60666 /sys/class/net/lo/mtu', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11749 1726773097.47612: _low_level_execute_command(): starting 11749 1726773097.47617: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773097.1995494-11749-48669983232710/ > /dev/null 2>&1 && sleep 0' 11749 1726773097.50273: stderr chunk (state=2): >>><<< 11749 1726773097.50287: stdout chunk (state=2): >>><<< 11749 1726773097.50310: _low_level_execute_command() done: rc=0, stdout=, stderr= 11749 1726773097.50318: handler run complete 11749 1726773097.50328: attempt loop complete, returning result 11749 1726773097.50341: _execute() done 11749 1726773097.50343: dumping result to json 11749 1726773097.50347: done dumping result, returning 11749 1726773097.50359: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [12a3200b-1e9d-1dbd-cc52-00000000002e] 11749 1726773097.50378: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002e 11749 1726773097.50423: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002e 11749 1726773097.50468: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.003100", "end": "2024-09-19 15:11:37.462342", "rc": 0, "start": "2024-09-19 15:11:37.459242" } 8119 1726773097.50586: no more pending results, returning what we have 8119 1726773097.50593: results queue empty 8119 1726773097.50595: checking for any_errors_fatal 8119 1726773097.50602: done checking for any_errors_fatal 8119 1726773097.50604: checking for max_fail_percentage 8119 1726773097.50607: done checking for max_fail_percentage 8119 1726773097.50610: checking to see if all hosts have failed and the running result is not ok 8119 1726773097.50612: done checking to see if all hosts have failed 8119 1726773097.50614: getting the remaining hosts for this loop 8119 1726773097.50616: done getting the remaining hosts for this loop 8119 1726773097.50624: building list of next tasks for hosts 8119 1726773097.50626: getting the next task for host managed_node2 8119 1726773097.50633: done getting next task for host managed_node2 8119 1726773097.50636: ^ task is: TASK: Cleanup 8119 1726773097.50640: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=1, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8119 1726773097.50642: done building task lists 8119 1726773097.50644: counting tasks in each state of execution 8119 1726773097.50647: done counting tasks in each state of execution: num_setups: 0 num_tasks: 0 num_rescue: 0 num_always: 1 8119 1726773097.50649: advancing hosts in ITERATING_ALWAYS 8119 1726773097.50651: starting to advance hosts 8119 1726773097.50653: getting the next task for host managed_node2 8119 1726773097.50656: done getting next task for host managed_node2 8119 1726773097.50658: ^ task is: TASK: Cleanup 8119 1726773097.50660: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=1, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8119 1726773097.50662: done advancing hosts to next task 8119 1726773097.50676: getting variables 8119 1726773097.50679: in VariableManager get_vars() 8119 1726773097.50715: Calling all_inventory to load vars for managed_node2 8119 1726773097.50721: Calling groups_inventory to load vars for managed_node2 8119 1726773097.50725: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773097.50753: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.50768: Calling all_plugins_play to load vars for managed_node2 8119 1726773097.50786: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.50801: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773097.50817: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.50827: Calling groups_plugins_play to load vars for managed_node2 8119 1726773097.50838: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.50856: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.50869: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.51088: done with get_vars() 8119 1726773097.51099: done getting variables 8119 1726773097.51102: sending task start callback, copying the task so we can template it temporarily 8119 1726773097.51104: done copying, going to template now 8119 1726773097.51106: done templating 8119 1726773097.51107: here goes the callback... 
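A hedged sketch of the "Check sysfs after role runs" task (tests_change_settings.yml:208), reconstructed from the module_args traced above; _uses_shell is false, so it maps to the command module, and changed_when is again an inference from the callback output.

    - name: Check sysfs after role runs
      command: grep -Lxqv 60666 /sys/class/net/lo/mtu
      changed_when: false  # inferred from the changed=false in the callback above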
TASK [Cleanup] ***************************************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:213 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.355) 0:01:32.067 **** 8119 1726773097.51127: sending task start callback 8119 1726773097.51129: entering _queue_task() for managed_node2/include_tasks 8119 1726773097.51253: worker is 1 (out of 1 available) 8119 1726773097.51293: exiting _queue_task() for managed_node2/include_tasks 8119 1726773097.51365: done queuing things up, now waiting for results queue to drain 8119 1726773097.51370: waiting for pending results... 11764 1726773097.51431: running TaskExecutor() for managed_node2/TASK: Cleanup 11764 1726773097.51477: in run() - task 12a3200b-1e9d-1dbd-cc52-00000000002f 11764 1726773097.51529: calling self._execute() 11764 1726773097.51633: _execute() done 11764 1726773097.51638: dumping result to json 11764 1726773097.51640: done dumping result, returning 11764 1726773097.51644: done running TaskExecutor() for managed_node2/TASK: Cleanup [12a3200b-1e9d-1dbd-cc52-00000000002f] 11764 1726773097.51653: sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002f 11764 1726773097.51682: done sending task result for task 12a3200b-1e9d-1dbd-cc52-00000000002f 11764 1726773097.51716: WORKER PROCESS EXITING 8119 1726773097.51810: no more pending results, returning what we have 8119 1726773097.51819: in VariableManager get_vars() 8119 1726773097.51864: Calling all_inventory to load vars for managed_node2 8119 1726773097.51870: Calling groups_inventory to load vars for managed_node2 8119 1726773097.51874: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773097.51907: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.51925: Calling all_plugins_play to load vars for managed_node2 8119 1726773097.51943: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.51956: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773097.51968: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.51975: Calling groups_plugins_play to load vars for managed_node2 8119 1726773097.51987: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.52007: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.52024: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.52232: done with get_vars() 8119 1726773097.52250: we have included files to process 8119 1726773097.52252: generating all_blocks data 8119 1726773097.52254: done generating all_blocks data 8119 1726773097.52257: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8119 1726773097.52259: loading included file: 
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8119 1726773097.52262: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8119 1726773097.52679: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773097.52724: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node2 8119 1726773097.52847: done processing included file 8119 1726773097.52849: iterating over new_blocks loaded from include file 8119 1726773097.52851: in VariableManager get_vars() 8119 1726773097.52871: done with get_vars() 8119 1726773097.52873: filtering new block on tags 8119 1726773097.52902: done filtering new block on tags 8119 1726773097.52912: in VariableManager get_vars() 8119 1726773097.52927: done with get_vars() 8119 1726773097.52929: filtering new block on tags 8119 1726773097.52992: done filtering new block on tags 8119 1726773097.53001: done iterating over new_blocks loaded from include file 8119 1726773097.53003: extending task lists for all hosts with included blocks 8119 1726773097.55181: done extending task lists 8119 1726773097.55188: done processing included files 8119 1726773097.55190: results queue empty 8119 1726773097.55191: checking for any_errors_fatal 8119 1726773097.55195: done checking for any_errors_fatal 8119 1726773097.55196: checking for max_fail_percentage 8119 1726773097.55198: done checking for max_fail_percentage 8119 1726773097.55199: checking to see if all hosts have failed and the running result is not ok 8119 1726773097.55200: done checking to see if all hosts have failed 8119 1726773097.55202: getting the remaining hosts for this loop 8119 1726773097.55204: done getting the remaining hosts for this loop 8119 1726773097.55207: building list of next tasks for hosts 8119 1726773097.55211: getting the next task for host managed_node2 8119 1726773097.55215: done getting next task for host managed_node2 8119 1726773097.55217: ^ task is: TASK: Show current tuned profile settings 8119 1726773097.55219: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=2, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773097.55221: done building task lists 8119 1726773097.55222: counting tasks in each state of execution 8119 1726773097.55225: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773097.55226: advancing hosts in ITERATING_TASKS 8119 1726773097.55228: starting to advance hosts 8119 1726773097.55230: getting the next task for host managed_node2 8119 1726773097.55233: done getting next task for host managed_node2 8119 1726773097.55236: ^ task is: TASK: Show current tuned profile settings 8119 1726773097.55239: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=2, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773097.55240: done advancing hosts to next task 8119 1726773097.55247: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773097.55249: getting variables 8119 1726773097.55250: in VariableManager get_vars() 8119 1726773097.55266: Calling all_inventory to load vars for managed_node2 8119 1726773097.55269: Calling groups_inventory to load vars for managed_node2 8119 1726773097.55271: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773097.55291: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.55300: Calling all_plugins_play to load vars for managed_node2 8119 1726773097.55313: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.55322: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773097.55333: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.55339: Calling groups_plugins_play to load vars for managed_node2 8119 1726773097.55351: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.55371: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.55387: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.55571: done with get_vars() 8119 1726773097.55586: done getting variables 8119 1726773097.55591: sending task start callback, copying the task so we can template it temporarily 8119 1726773097.55593: done copying, going to template now 8119 1726773097.55595: done templating 8119 1726773097.55596: here goes the callback... 
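The host state above reaches the Cleanup task in run_state=ITERATING_ALWAYS, which suggests it sits in a block's always: section of tests_change_settings.yml (line 213). A minimal sketch under that assumption; the surrounding block itself is not shown in this log.

      always:
        - name: Cleanup
          include_tasks: tasks/cleanup.yml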
TASK [Show current tuned profile settings] ************************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.044) 0:01:32.112 **** 8119 1726773097.55613: sending task start callback 8119 1726773097.55614: entering _queue_task() for managed_node2/command 8119 1726773097.55765: worker is 1 (out of 1 available) 8119 1726773097.55806: exiting _queue_task() for managed_node2/command 8119 1726773097.55885: done queuing things up, now waiting for results queue to drain 8119 1726773097.55890: waiting for pending results... 11766 1726773097.55947: running TaskExecutor() for managed_node2/TASK: Show current tuned profile settings 11766 1726773097.55994: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c47 11766 1726773097.56042: calling self._execute() 11766 1726773097.57838: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11766 1726773097.57923: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11766 1726773097.57986: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11766 1726773097.58017: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11766 1726773097.58043: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11766 1726773097.58073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11766 1726773097.58122: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11766 1726773097.58146: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11766 1726773097.58161: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11766 1726773097.58244: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11766 1726773097.58260: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11766 1726773097.58275: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11766 1726773097.58563: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11766 1726773097.58600: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11766 1726773097.58612: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11766 1726773097.58623: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11766 1726773097.58628: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11766 1726773097.58760: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11766 1726773097.58782: starting attempt loop 11766 1726773097.58788: 
running the handler 11766 1726773097.58802: _low_level_execute_command(): starting 11766 1726773097.58808: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11766 1726773097.61624: stdout chunk (state=2): >>>/root <<< 11766 1726773097.61754: stderr chunk (state=3): >>><<< 11766 1726773097.61760: stdout chunk (state=3): >>><<< 11766 1726773097.61785: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11766 1726773097.61807: _low_level_execute_command(): starting 11766 1726773097.61819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473 `" && echo ansible-tmp-1726773097.6179626-11766-97317173390473="` echo /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473 `" ) && sleep 0' 11766 1726773097.65098: stdout chunk (state=2): >>>ansible-tmp-1726773097.6179626-11766-97317173390473=/root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473 <<< 11766 1726773097.65120: stderr chunk (state=2): >>><<< 11766 1726773097.65134: stdout chunk (state=3): >>><<< 11766 1726773097.65154: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773097.6179626-11766-97317173390473=/root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473 , stderr= 11766 1726773097.65302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 11766 1726773097.65376: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473/AnsiballZ_command.py 11766 1726773097.66202: Sending initial data 11766 1726773097.66218: Sent initial data (154 bytes) 11766 1726773097.68651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpfy3dsiuy /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473/AnsiballZ_command.py <<< 11766 1726773097.70227: stderr chunk (state=3): >>><<< 11766 1726773097.70238: stdout chunk (state=3): >>><<< 11766 1726773097.70270: done transferring module to remote 11766 1726773097.70293: _low_level_execute_command(): starting 11766 1726773097.70301: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473/ /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473/AnsiballZ_command.py && sleep 0' 11766 1726773097.73498: stderr chunk (state=2): >>><<< 11766 1726773097.73517: stdout chunk (state=2): >>><<< 11766 1726773097.73545: _low_level_execute_command() done: rc=0, stdout=, stderr= 11766 1726773097.73551: _low_level_execute_command(): starting 11766 1726773097.73564: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473/AnsiballZ_command.py && sleep 0' 11766 1726773097.88592: stdout chunk (state=2): >>> {"cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[vm]\ntransparent_hugepages = never", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:37.881337", "end": "2024-09-19 15:11:37.884025", "delta": "0:00:00.002688", "changed": true, "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, 
"creates": null, "removes": null, "stdin": null}}} <<< 11766 1726773097.89616: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11766 1726773097.89658: stderr chunk (state=3): >>><<< 11766 1726773097.89664: stdout chunk (state=3): >>><<< 11766 1726773097.89687: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[vm]\ntransparent_hugepages = never", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:37.881337", "end": "2024-09-19 15:11:37.884025", "delta": "0:00:00.002688", "changed": true, "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11766 1726773097.89727: done with _execute_module (command, {'_raw_params': 'cat /etc/tuned/kernel_settings/tuned.conf', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11766 1726773097.89739: _low_level_execute_command(): starting 11766 1726773097.89744: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773097.6179626-11766-97317173390473/ > /dev/null 2>&1 && sleep 0' 11766 1726773097.92401: stderr chunk (state=2): >>><<< 11766 1726773097.92417: stdout chunk (state=2): >>><<< 11766 1726773097.92437: _low_level_execute_command() done: rc=0, stdout=, stderr= 11766 1726773097.92444: handler run complete 11766 1726773097.92453: attempt loop complete, returning result 11766 1726773097.92466: _execute() done 11766 1726773097.92467: dumping result to json 11766 1726773097.92472: done dumping result, returning 11766 1726773097.92482: done running TaskExecutor() for managed_node2/TASK: Show current tuned profile settings [12a3200b-1e9d-1dbd-cc52-000000000c47] 11766 1726773097.92497: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c47 11766 1726773097.92542: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c47 11766 1726773097.92546: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/tuned/kernel_settings/tuned.conf" ], "delta": "0:00:00.002688", "end": "2024-09-19 15:11:37.884025", "rc": 0, "start": "2024-09-19 15:11:37.881337" } STDOUT: # # Ansible managed # # system_role:kernel_settings [main] summary = kernel settings [vm] transparent_hugepages = never 8119 1726773097.92729: no more pending results, returning what we have 8119 1726773097.92733: results queue empty 8119 1726773097.92735: checking for any_errors_fatal 8119 1726773097.92738: done checking for any_errors_fatal 8119 1726773097.92739: checking for max_fail_percentage 8119 1726773097.92742: done checking for max_fail_percentage 8119 1726773097.92743: checking to see if all hosts 
have failed and the running result is not ok 8119 1726773097.92745: done checking to see if all hosts have failed 8119 1726773097.92746: getting the remaining hosts for this loop 8119 1726773097.92748: done getting the remaining hosts for this loop 8119 1726773097.92754: building list of next tasks for hosts 8119 1726773097.92756: getting the next task for host managed_node2 8119 1726773097.92767: done getting next task for host managed_node2 8119 1726773097.92770: ^ task is: TASK: Run role with purge to remove everything 8119 1726773097.92773: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773097.92775: done building task lists 8119 1726773097.92776: counting tasks in each state of execution 8119 1726773097.92780: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773097.92781: advancing hosts in ITERATING_TASKS 8119 1726773097.92785: starting to advance hosts 8119 1726773097.92787: getting the next task for host managed_node2 8119 1726773097.92793: done getting next task for host managed_node2 8119 1726773097.92796: ^ task is: TASK: Run role with purge to remove everything 8119 1726773097.92800: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773097.92802: done advancing hosts to next task 8119 1726773097.92818: getting variables 8119 1726773097.92822: in VariableManager get_vars() 8119 1726773097.92858: Calling all_inventory to load vars for managed_node2 8119 1726773097.92864: Calling groups_inventory to load vars for managed_node2 8119 1726773097.92867: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773097.92897: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.92913: Calling all_plugins_play to load vars for managed_node2 8119 1726773097.92929: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.92942: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773097.92957: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.92966: Calling groups_plugins_play to load vars for managed_node2 8119 1726773097.92980: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.93009: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.93032: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.93244: done with get_vars() 8119 1726773097.93254: done getting variables 8119 1726773097.93258: sending task start callback, copying the task so we can template it temporarily 8119 1726773097.93260: done copying, going to template now 8119 1726773097.93262: done templating 8119 1726773097.93263: here goes the callback... TASK [Run role with purge to remove everything] ******************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.376) 0:01:32.489 **** 8119 1726773097.93278: sending task start callback 8119 1726773097.93280: entering _queue_task() for managed_node2/include_role 8119 1726773097.93410: worker is 1 (out of 1 available) 8119 1726773097.93451: exiting _queue_task() for managed_node2/include_role 8119 1726773097.93524: done queuing things up, now waiting for results queue to drain 8119 1726773097.93530: waiting for pending results... 
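Based on the task paths (tasks/cleanup.yml:2 and :9), the module_args, and the include_role processing traced above, the start of cleanup.yml plausibly looks like the sketch below. The purge variable name is an assumption taken from the task name and is not shown in this log.

    - name: Show current tuned profile settings
      command: cat /etc/tuned/kernel_settings/tuned.conf
      changed_when: false  # inferred from the changed=false in the callback above

    - name: Run role with purge to remove everything
      include_role:
        name: fedora.linux_system_roles.kernel_settings
      vars:
        kernel_settings_purge: true  # variable name assumed from the task name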
11803 1726773097.93592: running TaskExecutor() for managed_node2/TASK: Run role with purge to remove everything 11803 1726773097.93645: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c49 11803 1726773097.93692: calling self._execute() 11803 1726773097.93795: _execute() done 11803 1726773097.93800: dumping result to json 11803 1726773097.93803: done dumping result, returning 11803 1726773097.93806: done running TaskExecutor() for managed_node2/TASK: Run role with purge to remove everything [12a3200b-1e9d-1dbd-cc52-000000000c49] 11803 1726773097.93818: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c49 11803 1726773097.93847: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c49 11803 1726773097.93851: WORKER PROCESS EXITING 8119 1726773097.94061: no more pending results, returning what we have 8119 1726773097.94068: in VariableManager get_vars() 8119 1726773097.94099: Calling all_inventory to load vars for managed_node2 8119 1726773097.94103: Calling groups_inventory to load vars for managed_node2 8119 1726773097.94105: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773097.94128: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.94138: Calling all_plugins_play to load vars for managed_node2 8119 1726773097.94148: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.94156: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773097.94166: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.94172: Calling groups_plugins_play to load vars for managed_node2 8119 1726773097.94181: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.94203: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.94219: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.94435: done with get_vars() 8119 1726773097.94597: we have included files to process 8119 1726773097.94600: generating all_blocks data 8119 1726773097.94603: done generating all_blocks data 8119 1726773097.94605: processing included file: fedora.linux_system_roles.kernel_settings 8119 1726773097.94619: in VariableManager get_vars() 8119 1726773097.94635: done with get_vars() 8119 1726773097.94686: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8119 1726773097.94732: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8119 1726773097.94752: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8119 1726773097.94806: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8119 1726773097.95147: in VariableManager get_vars() 8119 1726773097.95167: done with get_vars() 
8119 1726773097.95324: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773097.95372: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773097.95470: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773097.95512: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773097.95627: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773097.95768: in VariableManager get_vars() 8119 1726773097.95791: done with get_vars() 8119 1726773097.95861: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8119 1726773097.96217: iterating over new_blocks loaded from include file 8119 1726773097.96222: in VariableManager get_vars() 8119 1726773097.96244: done with get_vars() 8119 1726773097.96247: filtering new block on tags 8119 1726773097.96325: done filtering new block on tags 8119 1726773097.96340: in VariableManager get_vars() 8119 1726773097.96362: done with get_vars() 8119 1726773097.96366: filtering new block on tags 8119 1726773097.96453: done filtering new block on tags 8119 1726773097.96479: in VariableManager get_vars() 8119 1726773097.96505: done with get_vars() 8119 1726773097.96510: filtering new block on tags 8119 1726773097.96702: done filtering new block on tags 8119 1726773097.96716: done iterating over new_blocks loaded from include file 8119 1726773097.96720: extending task lists for all hosts with included blocks 8119 1726773097.96989: done extending task lists 8119 1726773097.96993: done processing included files 8119 1726773097.96996: results queue empty 8119 1726773097.96998: checking for any_errors_fatal 8119 1726773097.97003: done checking for any_errors_fatal 8119 1726773097.97005: checking for max_fail_percentage 8119 1726773097.97007: done checking for max_fail_percentage 8119 1726773097.97009: checking to see if all hosts have failed and the running result is not ok 8119 1726773097.97011: done checking to see if all hosts have failed 8119 1726773097.97013: getting the remaining hosts for this loop 8119 1726773097.97016: done getting the remaining hosts for this loop 8119 1726773097.97022: building list of next tasks for hosts 8119 1726773097.97024: getting the next task for host managed_node2 8119 1726773097.97030: done getting next task for host managed_node2 8119 1726773097.97034: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8119 1726773097.97042: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773097.97048: done building task lists 8119 1726773097.97050: counting tasks in each state of execution 8119 1726773097.97054: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773097.97057: advancing hosts in ITERATING_TASKS 8119 1726773097.97059: starting to advance hosts 8119 1726773097.97061: getting the next task for host managed_node2 8119 1726773097.97067: done getting next task for host managed_node2 8119 1726773097.97070: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8119 1726773097.97074: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773097.97077: done advancing hosts to next task 8119 1726773097.97087: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773097.97092: getting variables 8119 1726773097.97095: in VariableManager get_vars() 8119 1726773097.97115: Calling all_inventory to load vars for managed_node2 8119 1726773097.97121: Calling groups_inventory to load vars for managed_node2 8119 1726773097.97125: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773097.97145: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.97155: Calling all_plugins_play to load vars for managed_node2 8119 1726773097.97173: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.97193: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773097.97206: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.97213: Calling groups_plugins_play to load vars for managed_node2 8119 1726773097.97223: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.97240: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773097.97253: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, 
class_only=False) 8119 1726773097.97437: done with get_vars() 8119 1726773097.97447: done getting variables 8119 1726773097.97451: sending task start callback, copying the task so we can template it temporarily 8119 1726773097.97453: done copying, going to template now 8119 1726773097.97455: done templating 8119 1726773097.97457: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.041) 0:01:32.531 **** 8119 1726773097.97472: sending task start callback 8119 1726773097.97474: entering _queue_task() for managed_node2/fail 8119 1726773097.97616: worker is 1 (out of 1 available) 8119 1726773097.97653: exiting _queue_task() for managed_node2/fail 8119 1726773097.97725: done queuing things up, now waiting for results queue to drain 8119 1726773097.97730: waiting for pending results... 11807 1726773097.97798: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 11807 1726773097.97851: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e58 11807 1726773097.97898: calling self._execute() 11807 1726773097.99662: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11807 1726773097.99740: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11807 1726773097.99797: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11807 1726773097.99827: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11807 1726773097.99852: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11807 1726773097.99884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11807 1726773097.99948: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11807 1726773097.99977: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11807 1726773098.00001: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11807 1726773098.00100: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11807 1726773098.00125: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11807 1726773098.00287: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11807 1726773098.01026: when evaluation is False, skipping this task 11807 1726773098.01031: _execute() done 11807 1726773098.01033: dumping result to json 11807 1726773098.01035: done dumping result, returning 11807 1726773098.01039: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [12a3200b-1e9d-1dbd-cc52-000000000e58] 11807 1726773098.01047: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e58 11807 1726773098.01073: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e58 11807 1726773098.01076: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773098.01202: no more pending results, returning what we have 8119 1726773098.01206: results queue empty 8119 1726773098.01209: checking for any_errors_fatal 8119 1726773098.01216: done checking for any_errors_fatal 8119 1726773098.01217: checking for max_fail_percentage 8119 1726773098.01220: done checking for max_fail_percentage 8119 1726773098.01222: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.01223: done checking to see if all hosts have failed 8119 1726773098.01224: getting the remaining hosts for this loop 8119 1726773098.01226: done getting the remaining hosts for this loop 8119 1726773098.01232: building list of next tasks for hosts 8119 1726773098.01234: getting the next task for host managed_node2 8119 1726773098.01242: done getting next task for host managed_node2 8119 1726773098.01246: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773098.01250: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.01252: done building task lists 8119 1726773098.01253: counting tasks in each state of execution 8119 1726773098.01257: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.01259: advancing hosts in ITERATING_TASKS 8119 1726773098.01260: starting to advance hosts 8119 1726773098.01262: getting the next task for host managed_node2 8119 1726773098.01265: done getting next task for host managed_node2 8119 1726773098.01267: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8119 1726773098.01269: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773098.01270: done advancing hosts to next task 8119 1726773098.01286: getting variables 8119 1726773098.01290: in VariableManager get_vars() 8119 1726773098.01325: Calling all_inventory to load vars for managed_node2 8119 1726773098.01329: Calling groups_inventory to load vars for managed_node2 8119 1726773098.01331: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.01352: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.01363: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.01374: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.01382: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.01403: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.01416: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.01430: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.01454: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.01472: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.01741: done with get_vars() 8119 1726773098.01751: done getting variables 8119 1726773098.01756: sending task start callback, copying the task so we can template it temporarily 8119 1726773098.01758: done copying, going to template now 8119 1726773098.01760: done templating 8119 1726773098.01761: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.043) 0:01:32.574 **** 8119 1726773098.01777: sending task start callback 8119 1726773098.01779: entering _queue_task() for managed_node2/include_tasks 8119 1726773098.01905: worker is 1 (out of 1 available) 8119 1726773098.01945: exiting _queue_task() for managed_node2/include_tasks 8119 1726773098.02016: done queuing things up, now waiting for results queue to drain 8119 1726773098.02022: waiting for pending results... 
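The stretch of log above shows the usual shape of this part of the role: the fail action was queued for managed_node2, its when: condition evaluated to False inside TaskExecutor, the task was skipped with skip_reason "Conditional result was False", and the following "Set version specific variables" task was then queued as an include_tasks. As a minimal, self-contained sketch of a task that would produce exactly this kind of skip (the variable, value, and message below are invented for the example and are not the role's actual source):

  # illustrative_skip.yml -- a hypothetical standalone playbook, not the role source
  - hosts: localhost
    gather_facts: false
    vars:
      example_sysctl_value: 65535            # an integer, not a boolean
    tasks:
      - name: Check sysctl settings for boolean values (illustrative)
        fail:
          msg: boolean values are not allowed in sysctl settings
        # False for the integer above, so Ansible skips the task and reports
        # "Conditional result was False", just like the log entry above
        when: example_sysctl_value is sameas true or example_sysctl_value is sameas false

Running ansible-playbook illustrative_skip.yml prints the same skipping: [...] => {"changed": false, "skip_reason": "Conditional result was False"} result seen above; in the log, the role then continues with the include_tasks call that pulls in set_vars.yml.
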
11809 1726773098.02074: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 11809 1726773098.02132: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e59 11809 1726773098.02176: calling self._execute() 11809 1726773098.02280: _execute() done 11809 1726773098.02287: dumping result to json 11809 1726773098.02290: done dumping result, returning 11809 1726773098.02294: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [12a3200b-1e9d-1dbd-cc52-000000000e59] 11809 1726773098.02303: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e59 11809 1726773098.02335: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e59 11809 1726773098.02339: WORKER PROCESS EXITING 8119 1726773098.02467: no more pending results, returning what we have 8119 1726773098.02475: in VariableManager get_vars() 8119 1726773098.02521: Calling all_inventory to load vars for managed_node2 8119 1726773098.02527: Calling groups_inventory to load vars for managed_node2 8119 1726773098.02531: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.02557: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.02567: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.02578: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.02588: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.02600: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.02606: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.02618: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.02636: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.02649: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.02876: done with get_vars() 8119 1726773098.02918: we have included files to process 8119 1726773098.02921: generating all_blocks data 8119 1726773098.02923: done generating all_blocks data 8119 1726773098.02927: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773098.02929: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773098.02932: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8119 1726773098.03055: plugin lookup for setup failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773098.03119: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773098.03195: plugin lookup for stat failed; 
errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8119 1726773098.03305: done processing included file 8119 1726773098.03307: iterating over new_blocks loaded from include file 8119 1726773098.03311: in VariableManager get_vars() 8119 1726773098.03330: done with get_vars() 8119 1726773098.03332: filtering new block on tags 8119 1726773098.03381: done filtering new block on tags 8119 1726773098.03391: in VariableManager get_vars() 8119 1726773098.03411: done with get_vars() 8119 1726773098.03413: filtering new block on tags 8119 1726773098.03464: done filtering new block on tags 8119 1726773098.03485: in VariableManager get_vars() 8119 1726773098.03505: done with get_vars() 8119 1726773098.03509: filtering new block on tags 8119 1726773098.03559: done filtering new block on tags 8119 1726773098.03566: in VariableManager get_vars() 8119 1726773098.03584: done with get_vars() 8119 1726773098.03587: filtering new block on tags 8119 1726773098.03632: done filtering new block on tags 8119 1726773098.03639: done iterating over new_blocks loaded from include file 8119 1726773098.03641: extending task lists for all hosts with included blocks 8119 1726773098.03750: done extending task lists 8119 1726773098.03753: done processing included files 8119 1726773098.03754: results queue empty 8119 1726773098.03756: checking for any_errors_fatal 8119 1726773098.03758: done checking for any_errors_fatal 8119 1726773098.03760: checking for max_fail_percentage 8119 1726773098.03761: done checking for max_fail_percentage 8119 1726773098.03762: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.03764: done checking to see if all hosts have failed 8119 1726773098.03765: getting the remaining hosts for this loop 8119 1726773098.03767: done getting the remaining hosts for this loop 8119 1726773098.03770: building list of next tasks for hosts 8119 1726773098.03772: getting the next task for host managed_node2 8119 1726773098.03776: done getting next task for host managed_node2 8119 1726773098.03778: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773098.03782: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773098.03786: done building task lists 8119 1726773098.03787: counting tasks in each state of execution 8119 1726773098.03790: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.03791: advancing hosts in ITERATING_TASKS 8119 1726773098.03792: starting to advance hosts 8119 1726773098.03794: getting the next task for host managed_node2 8119 1726773098.03797: done getting next task for host managed_node2 8119 1726773098.03799: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8119 1726773098.03801: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.03803: done advancing hosts to next task 8119 1726773098.03810: getting variables 8119 1726773098.03812: in VariableManager get_vars() 8119 1726773098.03824: Calling all_inventory to load vars for managed_node2 8119 1726773098.03827: Calling groups_inventory to load vars for managed_node2 8119 1726773098.03829: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.03842: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.03849: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.03858: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.03866: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.03876: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.03882: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.03894: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.03913: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.03927: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.04186: done with get_vars() 8119 1726773098.04196: done getting variables 8119 1726773098.04201: sending task start callback, copying the task so we can template it 
temporarily 8119 1726773098.04202: done copying, going to template now 8119 1726773098.04204: done templating 8119 1726773098.04205: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.024) 0:01:32.598 **** 8119 1726773098.04223: sending task start callback 8119 1726773098.04225: entering _queue_task() for managed_node2/setup 8119 1726773098.04345: worker is 1 (out of 1 available) 8119 1726773098.04381: exiting _queue_task() for managed_node2/setup 8119 1726773098.04452: done queuing things up, now waiting for results queue to drain 8119 1726773098.04458: waiting for pending results... 11811 1726773098.04515: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 11811 1726773098.04572: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000ee1 11811 1726773098.04617: calling self._execute() 11811 1726773098.06433: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11811 1726773098.06525: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11811 1726773098.06575: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11811 1726773098.06603: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11811 1726773098.06633: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11811 1726773098.06663: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11811 1726773098.06706: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11811 1726773098.06733: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11811 1726773098.06752: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11811 1726773098.06829: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11811 1726773098.06846: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11811 1726773098.06861: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11811 1726773098.07255: when evaluation is False, skipping this task 11811 1726773098.07260: _execute() done 11811 1726773098.07262: dumping result to json 11811 1726773098.07264: done dumping result, returning 11811 1726773098.07267: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [12a3200b-1e9d-1dbd-cc52-000000000ee1] 11811 1726773098.07275: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee1 11811 1726773098.07305: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee1 11811 1726773098.07309: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773098.07546: no more pending results, returning what we have 8119 1726773098.07551: results queue empty 8119 
1726773098.07553: checking for any_errors_fatal 8119 1726773098.07557: done checking for any_errors_fatal 8119 1726773098.07558: checking for max_fail_percentage 8119 1726773098.07560: done checking for max_fail_percentage 8119 1726773098.07562: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.07563: done checking to see if all hosts have failed 8119 1726773098.07564: getting the remaining hosts for this loop 8119 1726773098.07566: done getting the remaining hosts for this loop 8119 1726773098.07571: building list of next tasks for hosts 8119 1726773098.07573: getting the next task for host managed_node2 8119 1726773098.07581: done getting next task for host managed_node2 8119 1726773098.07587: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773098.07592: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.07594: done building task lists 8119 1726773098.07595: counting tasks in each state of execution 8119 1726773098.07598: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.07600: advancing hosts in ITERATING_TASKS 8119 1726773098.07601: starting to advance hosts 8119 1726773098.07603: getting the next task for host managed_node2 8119 1726773098.07610: done getting next task for host managed_node2 8119 1726773098.07612: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8119 1726773098.07615: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773098.07616: done advancing hosts to next task 8119 1726773098.07627: getting variables 8119 1726773098.07630: in VariableManager get_vars() 8119 1726773098.07657: Calling all_inventory to load vars for managed_node2 8119 1726773098.07660: Calling groups_inventory to load vars for managed_node2 8119 1726773098.07662: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.07684: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.07696: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.07707: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.07718: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.07729: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.07734: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.07743: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.07760: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.07777: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.07989: done with get_vars() 8119 1726773098.08000: done getting variables 8119 1726773098.08004: sending task start callback, copying the task so we can template it temporarily 8119 1726773098.08006: done copying, going to template now 8119 1726773098.08010: done templating 8119 1726773098.08012: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.038) 0:01:32.636 **** 8119 1726773098.08028: sending task start callback 8119 1726773098.08029: entering _queue_task() for managed_node2/stat 8119 1726773098.08152: worker is 1 (out of 1 available) 8119 1726773098.08190: exiting _queue_task() for managed_node2/stat 8119 1726773098.08261: done queuing things up, now waiting for results queue to drain 8119 1726773098.08267: waiting for pending results... 
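At set_vars.yml:2 the setup action (fact gathering) was queued and then skipped because its condition was False, and at set_vars.yml:10 a stat task was queued for the ostree check. A rough sketch of that conditional-gathering-plus-probe pattern follows; the gather_subset choice, the /run/ostree-booted path, and the __example_* variable names are assumptions made for the example, not text taken from the role source.

  # Hypothetical sketch of the set_vars.yml:2 and :10 pattern
  - name: Ensure ansible_facts used by role (illustrative)
    setup:
      gather_subset: min
    # only re-gather when a fact the role needs is missing; when this is
    # False (facts already present) the task is skipped, as in the log
    when: ansible_facts.distribution is not defined

  - name: Check if system is ostree (illustrative)
    stat:
      path: /run/ostree-booted               # assumed marker file for OSTree systems
    register: __example_ostree_stat
    when: __example_is_ostree is not defined # skip the probe once the flag exists
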
11813 1726773098.08328: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 11813 1726773098.08390: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000ee3 11813 1726773098.08434: calling self._execute() 11813 1726773098.10100: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11813 1726773098.10176: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11813 1726773098.10230: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11813 1726773098.10257: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11813 1726773098.10285: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11813 1726773098.10315: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11813 1726773098.10367: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11813 1726773098.10393: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11813 1726773098.10411: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11813 1726773098.10486: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11813 1726773098.10516: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11813 1726773098.10533: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11813 1726773098.10782: when evaluation is False, skipping this task 11813 1726773098.10789: _execute() done 11813 1726773098.10790: dumping result to json 11813 1726773098.10792: done dumping result, returning 11813 1726773098.10796: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [12a3200b-1e9d-1dbd-cc52-000000000ee3] 11813 1726773098.10804: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee3 11813 1726773098.10832: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee3 11813 1726773098.10835: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773098.11013: no more pending results, returning what we have 8119 1726773098.11019: results queue empty 8119 1726773098.11021: checking for any_errors_fatal 8119 1726773098.11027: done checking for any_errors_fatal 8119 1726773098.11029: checking for max_fail_percentage 8119 1726773098.11033: done checking for max_fail_percentage 8119 1726773098.11035: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.11037: done checking to see if all hosts have failed 8119 1726773098.11039: getting the remaining hosts for this loop 8119 1726773098.11042: done getting the remaining hosts for this loop 8119 1726773098.11049: building list of next tasks for hosts 8119 1726773098.11052: getting the next task for host managed_node2 8119 1726773098.11061: done getting next task for host managed_node2 8119 1726773098.11066: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 
1726773098.11072: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.11074: done building task lists 8119 1726773098.11076: counting tasks in each state of execution 8119 1726773098.11080: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.11082: advancing hosts in ITERATING_TASKS 8119 1726773098.11087: starting to advance hosts 8119 1726773098.11089: getting the next task for host managed_node2 8119 1726773098.11094: done getting next task for host managed_node2 8119 1726773098.11097: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8119 1726773098.11101: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773098.11103: done advancing hosts to next task 8119 1726773098.11118: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773098.11122: getting variables 8119 1726773098.11124: in VariableManager get_vars() 8119 1726773098.11150: Calling all_inventory to load vars for managed_node2 8119 1726773098.11153: Calling groups_inventory to load vars for managed_node2 8119 1726773098.11155: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.11176: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.11188: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.11200: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.11210: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.11221: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.11227: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.11237: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.11254: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.11268: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.11472: done with get_vars() 8119 1726773098.11485: done getting variables 8119 1726773098.11493: sending task start callback, copying the task so we can template it temporarily 8119 1726773098.11495: done copying, going to template now 8119 1726773098.11497: done templating 8119 1726773098.11498: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.034) 0:01:32.671 **** 8119 1726773098.11519: sending task start callback 8119 1726773098.11521: entering _queue_task() for managed_node2/set_fact 8119 1726773098.11646: worker is 1 (out of 1 available) 8119 1726773098.11684: exiting _queue_task() for managed_node2/set_fact 8119 1726773098.11758: done queuing things up, now waiting for results queue to drain 8119 1726773098.11763: waiting for pending results... 
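The set_fact action loaded just above is the usual companion to such a stat probe: it turns the registered result into a flag so later tasks can skip the probe. A sketch using the same invented __example_* names as before:

  # Hypothetical follow-up to the ostree probe (the set_vars.yml:15 task in the log)
  - name: Set flag to indicate system is ostree (illustrative)
    set_fact:
      __example_is_ostree: "{{ __example_ostree_stat.stat.exists }}"
    when: __example_is_ostree is not defined
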
11815 1726773098.11819: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 11815 1726773098.11879: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000ee4 11815 1726773098.11926: calling self._execute() 11815 1726773098.13620: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11815 1726773098.13706: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11815 1726773098.13758: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11815 1726773098.13787: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11815 1726773098.13816: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11815 1726773098.13844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11815 1726773098.13892: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11815 1726773098.13917: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11815 1726773098.13934: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11815 1726773098.14023: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11815 1726773098.14040: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11815 1726773098.14055: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11815 1726773098.14303: when evaluation is False, skipping this task 11815 1726773098.14308: _execute() done 11815 1726773098.14311: dumping result to json 11815 1726773098.14312: done dumping result, returning 11815 1726773098.14317: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [12a3200b-1e9d-1dbd-cc52-000000000ee4] 11815 1726773098.14327: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee4 11815 1726773098.14354: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee4 11815 1726773098.14358: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773098.14522: no more pending results, returning what we have 8119 1726773098.14527: results queue empty 8119 1726773098.14529: checking for any_errors_fatal 8119 1726773098.14533: done checking for any_errors_fatal 8119 1726773098.14535: checking for max_fail_percentage 8119 1726773098.14538: done checking for max_fail_percentage 8119 1726773098.14540: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.14542: done checking to see if all hosts have failed 8119 1726773098.14544: getting the remaining hosts for this loop 8119 1726773098.14546: done getting the remaining hosts for this loop 8119 1726773098.14554: building list of next tasks for hosts 8119 1726773098.14556: getting the next task for host managed_node2 8119 1726773098.14567: done getting next task for host managed_node2 8119 1726773098.14573: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update 
exists in /sbin 8119 1726773098.14579: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.14581: done building task lists 8119 1726773098.14586: counting tasks in each state of execution 8119 1726773098.14590: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.14592: advancing hosts in ITERATING_TASKS 8119 1726773098.14595: starting to advance hosts 8119 1726773098.14597: getting the next task for host managed_node2 8119 1726773098.14604: done getting next task for host managed_node2 8119 1726773098.14607: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8119 1726773098.14613: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773098.14616: done advancing hosts to next task 8119 1726773098.14630: getting variables 8119 1726773098.14634: in VariableManager get_vars() 8119 1726773098.14662: Calling all_inventory to load vars for managed_node2 8119 1726773098.14665: Calling groups_inventory to load vars for managed_node2 8119 1726773098.14668: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.14690: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.14701: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.14714: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.14724: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.14734: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.14740: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.14750: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.14767: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.14780: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.14989: done with get_vars() 8119 1726773098.14999: done getting variables 8119 1726773098.15004: sending task start callback, copying the task so we can template it temporarily 8119 1726773098.15006: done copying, going to template now 8119 1726773098.15010: done templating 8119 1726773098.15011: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.035) 0:01:32.706 **** 8119 1726773098.15027: sending task start callback 8119 1726773098.15029: entering _queue_task() for managed_node2/stat 8119 1726773098.15147: worker is 1 (out of 1 available) 8119 1726773098.15182: exiting _queue_task() for managed_node2/stat 8119 1726773098.15255: done queuing things up, now waiting for results queue to drain 8119 1726773098.15260: waiting for pending results... 
11817 1726773098.15314: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 11817 1726773098.15373: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000ee6 11817 1726773098.15418: calling self._execute() 11817 1726773098.17102: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11817 1726773098.17181: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11817 1726773098.17234: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11817 1726773098.17274: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11817 1726773098.17304: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11817 1726773098.17333: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11817 1726773098.17377: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11817 1726773098.17402: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11817 1726773098.17420: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11817 1726773098.17498: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11817 1726773098.17517: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11817 1726773098.17533: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11817 1726773098.17776: when evaluation is False, skipping this task 11817 1726773098.17780: _execute() done 11817 1726773098.17782: dumping result to json 11817 1726773098.17787: done dumping result, returning 11817 1726773098.17791: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [12a3200b-1e9d-1dbd-cc52-000000000ee6] 11817 1726773098.17799: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee6 11817 1726773098.17824: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee6 11817 1726773098.17828: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773098.18009: no more pending results, returning what we have 8119 1726773098.18014: results queue empty 8119 1726773098.18017: checking for any_errors_fatal 8119 1726773098.18021: done checking for any_errors_fatal 8119 1726773098.18023: checking for max_fail_percentage 8119 1726773098.18026: done checking for max_fail_percentage 8119 1726773098.18028: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.18030: done checking to see if all hosts have failed 8119 1726773098.18032: getting the remaining hosts for this loop 8119 1726773098.18034: done getting the remaining hosts for this loop 8119 1726773098.18042: building list of next tasks for hosts 8119 1726773098.18044: getting the next task for host managed_node2 8119 1726773098.18053: done getting next task for host managed_node2 8119 1726773098.18058: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if 
transactional-update exists 8119 1726773098.18063: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.18066: done building task lists 8119 1726773098.18068: counting tasks in each state of execution 8119 1726773098.18072: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.18074: advancing hosts in ITERATING_TASKS 8119 1726773098.18076: starting to advance hosts 8119 1726773098.18079: getting the next task for host managed_node2 8119 1726773098.18084: done getting next task for host managed_node2 8119 1726773098.18087: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8119 1726773098.18090: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773098.18092: done advancing hosts to next task 8119 1726773098.18103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773098.18106: getting variables 8119 1726773098.18108: in VariableManager get_vars() 8119 1726773098.18135: Calling all_inventory to load vars for managed_node2 8119 1726773098.18138: Calling groups_inventory to load vars for managed_node2 8119 1726773098.18140: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.18160: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.18170: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.18180: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.18191: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.18202: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.18208: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.18219: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.18236: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.18250: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.18470: done with get_vars() 8119 1726773098.18480: done getting variables 8119 1726773098.18488: sending task start callback, copying the task so we can template it temporarily 8119 1726773098.18490: done copying, going to template now 8119 1726773098.18492: done templating 8119 1726773098.18493: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.034) 0:01:32.741 **** 8119 1726773098.18510: sending task start callback 8119 1726773098.18512: entering _queue_task() for managed_node2/set_fact 8119 1726773098.18627: worker is 1 (out of 1 available) 8119 1726773098.18664: exiting _queue_task() for managed_node2/set_fact 8119 1726773098.18736: done queuing things up, now waiting for results queue to drain 8119 1726773098.18741: waiting for pending results... 
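set_vars.yml:22 (the stat task skipped above) and set_vars.yml:27 (the set_fact just queued) repeat the same probe-then-flag pattern for transactional-update. A sketch with invented variable names; the /sbin/transactional-update path comes from the task name in the log, not from the role file itself.

  # Hypothetical sketch of set_vars.yml:22 and :27
  - name: Check if transactional-update exists in /sbin (illustrative)
    stat:
      path: /sbin/transactional-update
    register: __example_txupdate_stat
    when: __example_txupdate is not defined

  - name: Set flag if transactional-update exists (illustrative)
    set_fact:
      __example_txupdate: "{{ __example_txupdate_stat.stat.exists }}"
    when: __example_txupdate is not defined
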
11819 1726773098.18806: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 11819 1726773098.18870: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000ee7 11819 1726773098.18917: calling self._execute() 11819 1726773098.20601: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11819 1726773098.20705: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11819 1726773098.20756: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11819 1726773098.20781: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11819 1726773098.20817: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11819 1726773098.20846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11819 1726773098.20887: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11819 1726773098.20913: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11819 1726773098.20934: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11819 1726773098.21015: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11819 1726773098.21047: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11819 1726773098.21077: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11819 1726773098.21488: when evaluation is False, skipping this task 11819 1726773098.21495: _execute() done 11819 1726773098.21498: dumping result to json 11819 1726773098.21503: done dumping result, returning 11819 1726773098.21519: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [12a3200b-1e9d-1dbd-cc52-000000000ee7] 11819 1726773098.21536: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee7 11819 1726773098.21575: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee7 11819 1726773098.21580: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773098.21929: no more pending results, returning what we have 8119 1726773098.21935: results queue empty 8119 1726773098.21937: checking for any_errors_fatal 8119 1726773098.21943: done checking for any_errors_fatal 8119 1726773098.21945: checking for max_fail_percentage 8119 1726773098.21949: done checking for max_fail_percentage 8119 1726773098.21951: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.21953: done checking to see if all hosts have failed 8119 1726773098.21961: getting the remaining hosts for this loop 8119 1726773098.21967: done getting the remaining hosts for this loop 8119 1726773098.21976: building list of next tasks for hosts 8119 1726773098.21985: getting the next task for host managed_node2 8119 1726773098.22006: done getting next task for host managed_node2 8119 1726773098.22015: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version 
specific variables 8119 1726773098.22022: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.22026: done building task lists 8119 1726773098.22028: counting tasks in each state of execution 8119 1726773098.22035: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.22038: advancing hosts in ITERATING_TASKS 8119 1726773098.22041: starting to advance hosts 8119 1726773098.22044: getting the next task for host managed_node2 8119 1726773098.22052: done getting next task for host managed_node2 8119 1726773098.22056: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8119 1726773098.22061: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773098.22064: done advancing hosts to next task 8119 1726773098.22080: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773098.22088: getting variables 8119 1726773098.22093: in VariableManager get_vars() 8119 1726773098.22164: Calling all_inventory to load vars for managed_node2 8119 1726773098.22172: Calling groups_inventory to load vars for managed_node2 8119 1726773098.22176: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.22218: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.22245: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.22275: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.22293: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.22315: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.22328: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.22346: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.22378: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.22407: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.22828: done with get_vars() 8119 1726773098.22843: done getting variables 8119 1726773098.22853: sending task start callback, copying the task so we can template it temporarily 8119 1726773098.22856: done copying, going to template now 8119 1726773098.22859: done templating 8119 1726773098.22861: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.043) 0:01:32.785 **** 8119 1726773098.22897: sending task start callback 8119 1726773098.22901: entering _queue_task() for managed_node2/include_vars 8119 1726773098.23091: worker is 1 (out of 1 available) 8119 1726773098.23133: exiting _queue_task() for managed_node2/include_vars 8119 1726773098.23220: done queuing things up, now waiting for results queue to drain 8119 1726773098.23234: waiting for pending results... 
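The worker output that follows loads the first_found lookup and the include_vars action for the "Set platform/version specific variables" task, and its result reports vars/default.yml as the included file with __kernel_settings_packages and __kernel_settings_services as the resulting facts. A minimal sketch of the include_vars/first_found pattern this implies is shown below; only default.yml and the role's vars directory are confirmed by the log, the other candidate file names are assumptions.

    - name: Set platform/version specific variables
      include_vars: "{{ lookup('first_found', params) }}"
      vars:
        params:
          files:
            # assumed candidate list; only default.yml is confirmed by the result below
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - "{{ ansible_facts['os_family'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"
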
11821 1726773098.23506: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 11821 1726773098.23616: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000ee9 11821 1726773098.23679: calling self._execute() 11821 1726773098.26310: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11821 1726773098.26422: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11821 1726773098.26473: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11821 1726773098.26514: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11821 1726773098.26556: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11821 1726773098.26589: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11821 1726773098.26657: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11821 1726773098.26686: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11821 1726773098.26706: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11821 1726773098.26805: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11821 1726773098.26826: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11821 1726773098.26840: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11821 1726773098.27766: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup 11821 1726773098.27945: Loaded config def from plugin (lookup/first_found) 11821 1726773098.27950: Loading LookupModule 'first_found' from /usr/local/lib/python3.9/site-packages/ansible/plugins/lookup/first_found.py 11821 1726773098.28005: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11821 1726773098.28041: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11821 1726773098.28050: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11821 1726773098.28060: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11821 1726773098.28065: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11821 1726773098.28171: Loading ActionModule 'include_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11821 1726773098.28189: starting attempt loop 11821 1726773098.28191: running the handler 11821 1726773098.28234: handler run complete 11821 1726773098.28239: attempt loop complete, returning result 11821 1726773098.28241: _execute() done 11821 1726773098.28243: dumping result to json 11821 1726773098.28246: done dumping result, returning 11821 1726773098.28250: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [12a3200b-1e9d-1dbd-cc52-000000000ee9] 11821 1726773098.28257: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee9 11821 1726773098.28285: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000ee9 11821 1726773098.28317: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8119 1726773098.28490: no more pending results, returning what we have 8119 1726773098.28496: results queue empty 8119 1726773098.28498: checking for any_errors_fatal 8119 1726773098.28503: done checking for any_errors_fatal 8119 1726773098.28505: checking for max_fail_percentage 8119 1726773098.28508: done checking for max_fail_percentage 8119 1726773098.28510: checking to see if all hosts have failed and the running result is not ok 8119 1726773098.28512: done checking to see if all hosts have failed 8119 1726773098.28514: getting the remaining hosts for this loop 8119 1726773098.28517: done getting the remaining hosts for this loop 8119 1726773098.28524: building list of next tasks for hosts 8119 1726773098.28527: getting the next task for host managed_node2 8119 1726773098.28536: done getting next task for host managed_node2 8119 1726773098.28540: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773098.28544: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.28547: done building task lists 8119 1726773098.28549: counting tasks in each state of execution 8119 1726773098.28553: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773098.28555: advancing hosts in ITERATING_TASKS 8119 1726773098.28557: starting to advance hosts 8119 1726773098.28560: getting the next task for host managed_node2 8119 1726773098.28565: done getting next task for host managed_node2 8119 1726773098.28568: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8119 1726773098.28570: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773098.28572: done advancing hosts to next task 8119 1726773098.28585: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773098.28589: getting variables 8119 1726773098.28591: in VariableManager get_vars() 8119 1726773098.28619: Calling all_inventory to load vars for managed_node2 8119 1726773098.28622: Calling groups_inventory to load vars for managed_node2 8119 1726773098.28625: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773098.28645: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.28655: Calling all_plugins_play to load vars for managed_node2 8119 1726773098.28665: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.28673: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773098.28685: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.28693: Calling groups_plugins_play to load vars for managed_node2 8119 1726773098.28703: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.28723: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.28737: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773098.28938: done with get_vars() 8119 1726773098.28949: done getting variables 8119 1726773098.28954: sending task start callback, copying the task so we can template it temporarily 8119 1726773098.28956: done copying, going to template now 8119 1726773098.28958: done templating 8119 1726773098.28959: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.060) 0:01:32.846 **** 8119 1726773098.28975: sending task start callback 8119 1726773098.28977: entering _queue_task() for managed_node2/package 8119 1726773098.29105: worker is 1 (out of 1 available) 8119 1726773098.29142: exiting _queue_task() for managed_node2/package 8119 1726773098.29212: done queuing things up, now waiting for results queue to drain 8119 1726773098.29217: waiting for pending results... 
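The package task queued above runs through the generic package action and, as the later dnf module invocation in this log shows, resolves to installing tuned and python3-configobj with state=present. A minimal sketch of such a task, using the __kernel_settings_packages fact set by the previous include_vars:

    - name: Ensure required packages are installed
      package:
        name: "{{ __kernel_settings_packages }}"
        state: present
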
11823 1726773098.29298: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 11823 1726773098.29370: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e5a 11823 1726773098.29421: calling self._execute() 11823 1726773098.31819: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11823 1726773098.31940: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11823 1726773098.32017: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11823 1726773098.32059: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11823 1726773098.32103: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11823 1726773098.32147: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11823 1726773098.32213: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11823 1726773098.32262: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11823 1726773098.32290: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11823 1726773098.32411: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11823 1726773098.32437: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11823 1726773098.32459: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11823 1726773098.32665: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11823 1726773098.32671: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11823 1726773098.32675: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11823 1726773098.32678: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11823 1726773098.32681: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11823 1726773098.32687: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11823 1726773098.32690: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11823 1726773098.32693: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11823 1726773098.32696: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11823 1726773098.32724: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11823 1726773098.32729: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11823 1726773098.32733: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11823 1726773098.32984: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11823 1726773098.33038: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11823 1726773098.33053: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11823 1726773098.33070: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11823 1726773098.33078: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11823 1726773098.33219: Loading ActionModule 'package' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11823 1726773098.33243: starting attempt loop 11823 1726773098.33247: running the handler 11823 1726773098.33431: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale 11823 1726773098.33446: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity 11823 1726773098.33456: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap 11823 1726773098.33469: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox 11823 1726773098.33485: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios 11823 1726773098.33521: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/__pycache__ 11823 1726773098.33543: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/basics/__pycache__ 11823 1726773098.33553: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/exoscale/__pycache__ 11823 1726773098.33560: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/infinity/__pycache__ 11823 1726773098.33567: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/ldap/__pycache__ 11823 1726773098.33575: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/netbox/__pycache__ 11823 1726773098.33588: trying /usr/local/lib/python3.9/site-packages/ansible/modules/net_tools/nios/__pycache__ 11823 1726773098.33613: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10 11823 1726773098.33627: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci 11823 1726773098.33798: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos 11823 1726773098.33814: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos 11823 1726773098.33839: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba 11823 1726773098.33849: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa 11823 1726773098.33862: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi 11823 1726773098.33967: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch 11823 1726773098.33981: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point 11823 1726773098.34142: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix 11823 1726773098.34152: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli 11823 1726773098.34162: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine 11823 1726773098.34269: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision 11823 1726773098.34279: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos 11823 1726773098.34331: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus 11823 1726773098.34351: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10 11823 1726773098.34363: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6 11823 1726773098.34375: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9 11823 1726773098.34388: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos 11823 1726773098.34400: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch 11823 1726773098.34414: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos 11823 1726773098.34426: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos 11823 1726773098.34470: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli 11823 1726773098.34480: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos 11823 1726773098.34497: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5 11823 1726773098.34766: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files 11823 1726773098.34777: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer 11823 1726773098.34788: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager 11823 1726773098.34838: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios 11823 1726773098.35551: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr 11823 1726773098.35564: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd 11823 1726773098.35577: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx 11823 1726773098.35850: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos 11823 1726773098.35875: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate 11823 1726773098.35888: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface 11823 1726773098.35900: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios 11823 1726773098.35948: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr 11823 1726773098.35982: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware 11823 1726773098.35996: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential 11823 1726773098.36006: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/junos 11823 1726773098.36059: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2 11823 1726773098.36070: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3 11823 1726773098.36079: 
trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki 11823 1726773098.36116: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact 11823 1726773098.36126: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf 11823 1726773098.36137: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler 11823 1726773098.36164: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor 11823 1726773098.36249: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos 11823 1726773098.36262: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso 11823 1726773098.36276: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage 11823 1726773098.36286: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos 11823 1726773098.36421: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx 11823 1726773098.36466: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx 11823 1726773098.36475: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance 11823 1726773098.36488: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs 11823 1726773098.36499: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos 11823 1726773098.36544: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol 11823 1726773098.36554: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware 11823 1726773098.36565: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf 11823 1726773098.36574: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros 11823 1726773098.36586: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing 11823 1726773098.36594: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive 11823 1726773098.36605: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos 11823 1726773098.36628: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros 11823 1726773098.36640: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system 11823 1726773098.36653: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss 11823 1726773098.36663: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos 11823 1726773098.36701: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/__pycache__ 11823 1726773098.36712: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/a10/__pycache__ 11823 1726773098.36724: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aci/__pycache__ 11823 1726773098.36831: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aireos/__pycache__ 11823 1726773098.36841: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aos/__pycache__ 11823 1726773098.36859: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/aruba/__pycache__ 11823 1726773098.36867: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/asa/__pycache__ 11823 1726773098.36876: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/avi/__pycache__ 11823 1726773098.36947: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/bigswitch/__pycache__ 11823 1726773098.36958: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/check_point/__pycache__ 11823 1726773098.37059: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/citrix/__pycache__ 11823 1726773098.37068: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cli/__pycache__ 11823 1726773098.37077: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudengine/__pycache__ 11823 1726773098.37148: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cloudvision/__pycache__ 11823 1726773098.37157: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cnos/__pycache__ 11823 1726773098.37189: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/cumulus/__pycache__ 11823 1726773098.37204: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos10/__pycache__ 11823 1726773098.37214: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos6/__pycache__ 11823 1726773098.37223: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/dellos9/__pycache__ 11823 1726773098.37232: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeos/__pycache__ 11823 1726773098.37240: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/edgeswitch/__pycache__ 11823 1726773098.37248: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/enos/__pycache__ 11823 1726773098.37256: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eos/__pycache__ 11823 1726773098.37288: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/eric_eccli/__pycache__ 11823 1726773098.37297: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/exos/__pycache__ 11823 1726773098.37306: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/f5/__pycache__ 11823 1726773098.37474: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/files/__pycache__ 11823 1726773098.37486: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortianalyzer/__pycache__ 11823 1726773098.37494: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortimanager/__pycache__ 11823 1726773098.37528: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/fortios/__pycache__ 11823 1726773098.37946: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/frr/__pycache__ 11823 1726773098.37957: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ftd/__pycache__ 11823 1726773098.37967: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/icx/__pycache__ 11823 1726773098.37990: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/illumos/__pycache__ 11823 1726773098.38010: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ingate/__pycache__ 11823 1726773098.38019: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/interface/__pycache__ 11823 1726773098.38028: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ios/__pycache__ 11823 1726773098.38058: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/iosxr/__pycache__ 11823 1726773098.38082: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ironware/__pycache__ 11823 1726773098.38096: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/itential/__pycache__ 11823 1726773098.38103: trying 
/usr/local/lib/python3.9/site-packages/ansible/modules/network/junos/__pycache__ 11823 1726773098.38140: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer2/__pycache__ 11823 1726773098.38149: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/layer3/__pycache__ 11823 1726773098.38156: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/meraki/__pycache__ 11823 1726773098.38180: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netact/__pycache__ 11823 1726773098.38190: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netconf/__pycache__ 11823 1726773098.38199: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netscaler/__pycache__ 11823 1726773098.38220: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/netvisor/__pycache__ 11823 1726773098.38273: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nos/__pycache__ 11823 1726773098.38286: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nso/__pycache__ 11823 1726773098.38296: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nuage/__pycache__ 11823 1726773098.38303: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/nxos/__pycache__ 11823 1726773098.38390: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/onyx/__pycache__ 11823 1726773098.38428: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/opx/__pycache__ 11823 1726773098.38436: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ordnance/__pycache__ 11823 1726773098.38444: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/ovs/__pycache__ 11823 1726773098.38453: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/panos/__pycache__ 11823 1726773098.38480: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/protocol/__pycache__ 11823 1726773098.38493: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/radware/__pycache__ 11823 1726773098.38502: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/restconf/__pycache__ 11823 1726773098.38512: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routeros/__pycache__ 11823 1726773098.38520: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/routing/__pycache__ 11823 1726773098.38528: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/skydive/__pycache__ 11823 1726773098.38536: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/slxos/__pycache__ 11823 1726773098.38551: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/sros/__pycache__ 11823 1726773098.38560: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/system/__pycache__ 11823 1726773098.38570: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/voss/__pycache__ 11823 1726773098.38578: trying /usr/local/lib/python3.9/site-packages/ansible/modules/network/vyos/__pycache__ 11823 1726773098.38604: trying /usr/local/lib/python3.9/site-packages/ansible/modules/notification/__pycache__ 11823 1726773098.38642: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/language 11823 1726773098.38668: trying /usr/local/lib/python3.9/site-packages/ansible/modules/packaging/os 11823 1726773098.38759: _low_level_execute_command(): starting 11823 1726773098.38768: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11823 1726773098.41836: stdout chunk (state=2): >>>/root <<< 11823 1726773098.41986: stderr chunk (state=3): >>><<< 11823 1726773098.41995: stdout chunk (state=3): >>><<< 11823 1726773098.42026: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11823 1726773098.42045: _low_level_execute_command(): starting 11823 1726773098.42054: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988 `" && echo ansible-tmp-1726773098.4203827-11823-131083801017988="` echo /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988 `" ) && sleep 0' 11823 1726773098.45313: stdout chunk (state=2): >>>ansible-tmp-1726773098.4203827-11823-131083801017988=/root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988 <<< 11823 1726773098.45469: stderr chunk (state=3): >>><<< 11823 1726773098.45477: stdout chunk (state=3): >>><<< 11823 1726773098.45504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773098.4203827-11823-131083801017988=/root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988 , stderr= 11823 1726773098.45659: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/dnf-ZIP_DEFLATED 11823 1726773098.45745: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988/AnsiballZ_dnf.py 11823 1726773098.46557: Sending initial data 11823 1726773098.46571: Sent initial data (151 bytes) 11823 1726773098.49011: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmphelrzqe_ /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988/AnsiballZ_dnf.py <<< 11823 1726773098.50556: stderr chunk (state=3): >>><<< 11823 1726773098.50564: stdout chunk (state=3): >>><<< 11823 1726773098.50598: done transferring module to remote 11823 1726773098.50619: _low_level_execute_command(): starting 11823 1726773098.50627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988/ /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988/AnsiballZ_dnf.py && sleep 0' 11823 1726773098.53486: stderr chunk (state=2): >>><<< 11823 1726773098.53503: stdout chunk (state=2): >>><<< 11823 1726773098.53529: _low_level_execute_command() done: rc=0, stdout=, stderr= 11823 1726773098.53535: _low_level_execute_command(): starting 11823 1726773098.53544: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988/AnsiballZ_dnf.py && sleep 0' 11823 1726773101.07435: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 11823 
1726773101.10648: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11823 1726773101.10696: stderr chunk (state=3): >>><<< 11823 1726773101.10702: stdout chunk (state=3): >>><<< 11823 1726773101.10724: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11823 1726773101.10763: done with _execute_module (dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11823 1726773101.10773: _low_level_execute_command(): starting 11823 1726773101.10778: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773098.4203827-11823-131083801017988/ > /dev/null 2>&1 && sleep 0' 11823 1726773101.13549: stderr chunk (state=2): >>><<< 11823 1726773101.13564: stdout chunk (state=2): >>><<< 11823 1726773101.13585: _low_level_execute_command() done: rc=0, stdout=, stderr= 11823 1726773101.13595: handler run complete 11823 1726773101.13633: attempt loop complete, returning result 11823 1726773101.13650: _execute() done 11823 1726773101.13651: dumping result to json 11823 1726773101.13655: done dumping result, returning 11823 1726773101.13672: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [12a3200b-1e9d-1dbd-cc52-000000000e5a] 11823 1726773101.13688: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5a 11823 1726773101.13728: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5a 11823 1726773101.13790: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8119 1726773101.13927: no more pending results, returning what we have 8119 1726773101.13933: results queue empty 8119 1726773101.13935: checking for any_errors_fatal 8119 1726773101.13941: done checking for any_errors_fatal 8119 1726773101.13943: checking for max_fail_percentage 8119 1726773101.13946: done checking for max_fail_percentage 8119 1726773101.13948: checking to see if all hosts have failed and the running result is not ok 8119 1726773101.13949: done checking to see if all hosts have failed 8119 1726773101.13952: getting the remaining hosts for this loop 8119 1726773101.13954: done getting the remaining hosts for 
this loop 8119 1726773101.13962: building list of next tasks for hosts 8119 1726773101.13965: getting the next task for host managed_node2 8119 1726773101.13974: done getting next task for host managed_node2 8119 1726773101.13978: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773101.13985: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773101.13988: done building task lists 8119 1726773101.13990: counting tasks in each state of execution 8119 1726773101.13994: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773101.13996: advancing hosts in ITERATING_TASKS 8119 1726773101.13998: starting to advance hosts 8119 1726773101.14000: getting the next task for host managed_node2 8119 1726773101.14006: done getting next task for host managed_node2 8119 1726773101.14011: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8119 1726773101.14015: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773101.14017: done advancing hosts to next task 8119 1726773101.14033: Loading ActionModule 'debug' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773101.14037: getting variables 8119 1726773101.14040: in VariableManager get_vars() 8119 1726773101.14072: Calling all_inventory to load vars for managed_node2 8119 1726773101.14077: Calling groups_inventory to load vars for managed_node2 8119 1726773101.14080: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773101.14104: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.14118: Calling all_plugins_play to load vars for managed_node2 8119 1726773101.14129: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.14138: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773101.14148: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.14154: Calling groups_plugins_play to load vars for managed_node2 8119 1726773101.14164: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.14181: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.14199: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.14414: done with get_vars() 8119 1726773101.14424: done getting variables 8119 1726773101.14429: sending task start callback, copying the task so we can template it temporarily 8119 1726773101.14431: done copying, going to template now 8119 1726773101.14433: done templating 8119 1726773101.14434: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:41 -0400 (0:00:02.854) 0:01:35.700 **** 8119 1726773101.14452: sending task start callback 8119 1726773101.14454: entering _queue_task() for managed_node2/debug 8119 1726773101.14587: worker is 1 (out of 1 available) 8119 1726773101.14626: exiting _queue_task() for managed_node2/debug 8119 1726773101.14698: done queuing things up, now waiting for results queue to drain 8119 1726773101.14703: waiting for pending results... 
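The debug task queued above ("Notify user that reboot is needed to apply changes", tasks/main.yml:24) is skipped in the worker output that follows because its when condition is False. A hedged sketch of that shape of task is given here; the message text and the condition variable are placeholders, not values taken from this log.

    - name: Notify user that reboot is needed to apply changes
      debug:
        msg: Reboot is required on this host to apply the kernel settings changes  # illustrative text
      when: kernel_settings_reboot_needed | d(false)  # placeholder condition; the real expression is not shown in this log
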
11891 1726773101.14769: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 11891 1726773101.14827: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e5c 11891 1726773101.14871: calling self._execute() 11891 1726773101.16573: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11891 1726773101.16666: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11891 1726773101.16719: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11891 1726773101.16746: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11891 1726773101.16775: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11891 1726773101.16811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11891 1726773101.16852: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11891 1726773101.16875: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11891 1726773101.16897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11891 1726773101.16977: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11891 1726773101.16999: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11891 1726773101.17018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11891 1726773101.17285: when evaluation is False, skipping this task 11891 1726773101.17290: _execute() done 11891 1726773101.17292: dumping result to json 11891 1726773101.17293: done dumping result, returning 11891 1726773101.17297: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-000000000e5c] 11891 1726773101.17306: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5c 11891 1726773101.17337: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5c 11891 1726773101.17341: WORKER PROCESS EXITING skipping: [managed_node2] => {} 8119 1726773101.17533: no more pending results, returning what we have 8119 1726773101.17538: results queue empty 8119 1726773101.17540: checking for any_errors_fatal 8119 1726773101.17547: done checking for any_errors_fatal 8119 1726773101.17549: checking for max_fail_percentage 8119 1726773101.17552: done checking for max_fail_percentage 8119 1726773101.17554: checking to see if all hosts have failed and the running result is not ok 8119 1726773101.17556: done checking to see if all hosts have failed 8119 1726773101.17557: getting the remaining hosts for this loop 8119 1726773101.17560: done getting the remaining hosts for this loop 8119 1726773101.17568: building list of next tasks for hosts 8119 1726773101.17570: getting the next task for host managed_node2 8119 1726773101.17578: done getting next task for host managed_node2 8119 1726773101.17585: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773101.17590: ^ state 
is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773101.17593: done building task lists 8119 1726773101.17595: counting tasks in each state of execution 8119 1726773101.17599: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773101.17601: advancing hosts in ITERATING_TASKS 8119 1726773101.17603: starting to advance hosts 8119 1726773101.17605: getting the next task for host managed_node2 8119 1726773101.17610: done getting next task for host managed_node2 8119 1726773101.17613: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8119 1726773101.17617: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773101.17619: done advancing hosts to next task 8119 1726773101.17631: Loading ActionModule 'reboot' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773101.17634: getting variables 8119 1726773101.17637: in VariableManager get_vars() 8119 1726773101.17664: Calling all_inventory to load vars for managed_node2 8119 1726773101.17667: Calling groups_inventory to load vars for managed_node2 8119 1726773101.17669: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773101.17693: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.17704: Calling all_plugins_play to load vars for managed_node2 8119 1726773101.17716: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.17725: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773101.17735: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.17741: Calling groups_plugins_play to load vars for managed_node2 8119 1726773101.17750: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.17770: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.17787: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.17989: done with get_vars() 8119 1726773101.17999: done getting variables 8119 1726773101.18004: sending task start callback, copying the task so we can template it temporarily 8119 1726773101.18005: done copying, going to template now 8119 1726773101.18007: done templating 8119 1726773101.18009: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:41 -0400 (0:00:00.035) 0:01:35.736 **** 8119 1726773101.18025: sending task start callback 8119 1726773101.18027: entering _queue_task() for managed_node2/reboot 8119 1726773101.18153: worker is 1 (out of 1 available) 8119 1726773101.18191: exiting _queue_task() for managed_node2/reboot 8119 1726773101.18264: done queuing things up, now waiting for results queue to drain 8119 1726773101.18269: waiting for pending results... 
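The reboot task queued above (tasks/main.yml:29) is likewise skipped below because its when condition evaluates to False on this host. As a rough illustration only, a transactional-update reboot guard typically looks like the sketch below; both flag names are placeholders and are not confirmed by this log.

    - name: Reboot transactional update systems
      reboot:
      when:
        - __kernel_settings_is_transactional | d(false)   # placeholder flag name
        - kernel_settings_reboot_needed | d(false)        # placeholder flag name
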
11893 1726773101.18335: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 11893 1726773101.18391: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e5d 11893 1726773101.18440: calling self._execute() 11893 1726773101.20178: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11893 1726773101.20266: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11893 1726773101.20322: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11893 1726773101.20354: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11893 1726773101.20395: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11893 1726773101.20428: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11893 1726773101.20470: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11893 1726773101.20498: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11893 1726773101.20521: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11893 1726773101.20599: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11893 1726773101.20622: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11893 1726773101.20638: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11893 1726773101.20915: when evaluation is False, skipping this task 11893 1726773101.20919: _execute() done 11893 1726773101.20921: dumping result to json 11893 1726773101.20923: done dumping result, returning 11893 1726773101.20927: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [12a3200b-1e9d-1dbd-cc52-000000000e5d] 11893 1726773101.20936: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5d 11893 1726773101.20962: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5d 11893 1726773101.20966: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773101.21149: no more pending results, returning what we have 8119 1726773101.21154: results queue empty 8119 1726773101.21156: checking for any_errors_fatal 8119 1726773101.21161: done checking for any_errors_fatal 8119 1726773101.21163: checking for max_fail_percentage 8119 1726773101.21166: done checking for max_fail_percentage 8119 1726773101.21168: checking to see if all hosts have failed and the running result is not ok 8119 1726773101.21170: done checking to see if all hosts have failed 8119 1726773101.21172: getting the remaining hosts for this loop 8119 1726773101.21174: done getting the remaining hosts for this loop 8119 1726773101.21184: building list of next tasks for hosts 8119 1726773101.21187: getting the next task for host managed_node2 8119 1726773101.21195: done getting next task for host managed_node2 8119 1726773101.21201: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not 
set 8119 1726773101.21205: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773101.21208: done building task lists 8119 1726773101.21210: counting tasks in each state of execution 8119 1726773101.21214: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773101.21216: advancing hosts in ITERATING_TASKS 8119 1726773101.21219: starting to advance hosts 8119 1726773101.21221: getting the next task for host managed_node2 8119 1726773101.21225: done getting next task for host managed_node2 8119 1726773101.21228: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8119 1726773101.21232: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773101.21234: done advancing hosts to next task 8119 1726773101.21249: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773101.21253: getting variables 8119 1726773101.21256: in VariableManager get_vars() 8119 1726773101.21287: Calling all_inventory to load vars for managed_node2 8119 1726773101.21291: Calling groups_inventory to load vars for managed_node2 8119 1726773101.21294: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773101.21316: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.21327: Calling all_plugins_play to load vars for managed_node2 8119 1726773101.21337: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.21346: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773101.21356: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.21362: Calling groups_plugins_play to load vars for managed_node2 8119 1726773101.21372: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.21392: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.21407: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.21614: done with get_vars() 8119 1726773101.21625: done getting variables 8119 1726773101.21630: sending task start callback, copying the task so we can template it temporarily 8119 1726773101.21631: done copying, going to template now 8119 1726773101.21633: done templating 8119 1726773101.21634: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:41 -0400 (0:00:00.036) 0:01:35.772 **** 8119 1726773101.21651: sending task start callback 8119 1726773101.21653: entering _queue_task() for managed_node2/fail 8119 1726773101.21775: worker is 1 (out of 1 available) 8119 1726773101.21813: exiting _queue_task() for managed_node2/fail 8119 1726773101.21885: done queuing things up, now waiting for results queue to drain 8119 1726773101.21891: waiting for pending results... 
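The guard task queued above (tasks/main.yml:34) uses the fail action and is likewise skipped below because its conditional is False. A hedged sketch of such a guard; the message wording and variable names are assumptions, only the fail action and the skip come from the log:

    - name: Fail if reboot is needed and not set
      fail:
        msg: Reboot is required to apply the settings but was not allowed   # assumed wording
      when:
        - __kernel_settings_reboot_required | d(false)      # assumed variable name
        - not (kernel_settings_reboot_ok | d(false))        # assumed variable name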
11895 1726773101.21953: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 11895 1726773101.22012: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e5e 11895 1726773101.22058: calling self._execute() 11895 1726773101.23773: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11895 1726773101.23852: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11895 1726773101.23904: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11895 1726773101.23934: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11895 1726773101.23959: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11895 1726773101.23991: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11895 1726773101.24035: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11895 1726773101.24057: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11895 1726773101.24087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11895 1726773101.24166: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11895 1726773101.24182: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11895 1726773101.24198: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11895 1726773101.24452: when evaluation is False, skipping this task 11895 1726773101.24456: _execute() done 11895 1726773101.24458: dumping result to json 11895 1726773101.24460: done dumping result, returning 11895 1726773101.24464: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [12a3200b-1e9d-1dbd-cc52-000000000e5e] 11895 1726773101.24472: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5e 11895 1726773101.24501: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e5e 11895 1726773101.24527: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773101.24640: no more pending results, returning what we have 8119 1726773101.24644: results queue empty 8119 1726773101.24646: checking for any_errors_fatal 8119 1726773101.24651: done checking for any_errors_fatal 8119 1726773101.24653: checking for max_fail_percentage 8119 1726773101.24656: done checking for max_fail_percentage 8119 1726773101.24658: checking to see if all hosts have failed and the running result is not ok 8119 1726773101.24660: done checking to see if all hosts have failed 8119 1726773101.24661: getting the remaining hosts for this loop 8119 1726773101.24664: done getting the remaining hosts for this loop 8119 1726773101.24671: building list of next tasks for hosts 8119 1726773101.24674: getting the next task for host managed_node2 8119 1726773101.24686: done getting next task for host managed_node2 8119 1726773101.24692: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 
1726773101.24697: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773101.24699: done building task lists 8119 1726773101.24701: counting tasks in each state of execution 8119 1726773101.24705: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773101.24710: advancing hosts in ITERATING_TASKS 8119 1726773101.24712: starting to advance hosts 8119 1726773101.24715: getting the next task for host managed_node2 8119 1726773101.24722: done getting next task for host managed_node2 8119 1726773101.24725: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8119 1726773101.24729: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773101.24731: done advancing hosts to next task 8119 1726773101.24774: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773101.24779: getting variables 8119 1726773101.24781: in VariableManager get_vars() 8119 1726773101.24813: Calling all_inventory to load vars for managed_node2 8119 1726773101.24817: Calling groups_inventory to load vars for managed_node2 8119 1726773101.24819: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773101.24839: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.24849: Calling all_plugins_play to load vars for managed_node2 8119 1726773101.24859: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.24867: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773101.24877: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.24885: Calling groups_plugins_play to load vars for managed_node2 8119 1726773101.24897: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.24917: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.24931: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.25139: done with get_vars() 8119 1726773101.25150: done getting variables 8119 1726773101.25154: sending task start callback, copying the task so we can template it temporarily 8119 1726773101.25155: done copying, going to template now 8119 1726773101.25157: done templating 8119 1726773101.25159: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:41 -0400 (0:00:00.035) 0:01:35.808 **** 8119 1726773101.25175: sending task start callback 8119 1726773101.25176: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773101.25296: worker is 1 (out of 1 available) 8119 1726773101.25336: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773101.25406: done queuing things up, now waiting for results queue to drain 8119 1726773101.25414: waiting for pending results... 
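The task queued above runs the collection's own kernel_settings_get_config module; its module arguments and returned data appear in the execution that follows. A minimal sketch of how the task is typically written, where the path matches the module_args shown in the log and the register name is an assumption:

    - name: Read tuned main config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/tuned-main.conf
      register: __kernel_settings_tuned_main_conf   # assumed register name

The data mapping returned below mirrors the key=value settings of /etc/tuned/tuned-main.conf (daemon, dynamic_tuning, sleep_interval, and so on).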
11897 1726773101.25467: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 11897 1726773101.25523: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e60 11897 1726773101.25568: calling self._execute() 11897 1726773101.27265: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11897 1726773101.27345: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11897 1726773101.27407: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11897 1726773101.27435: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11897 1726773101.27461: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11897 1726773101.27490: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11897 1726773101.27535: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11897 1726773101.27557: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11897 1726773101.27574: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11897 1726773101.27653: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11897 1726773101.27670: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11897 1726773101.27685: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11897 1726773101.27903: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11897 1726773101.27936: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11897 1726773101.27947: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11897 1726773101.27957: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11897 1726773101.27962: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11897 1726773101.28044: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11897 1726773101.28057: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11897 1726773101.28079: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11897 1726773101.28099: starting attempt loop 11897 1726773101.28103: running the handler 11897 1726773101.28112: _low_level_execute_command(): starting 11897 1726773101.28117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11897 1726773101.30626: stdout chunk (state=2): >>>/root <<< 11897 1726773101.30844: stderr chunk (state=3): >>><<< 11897 
1726773101.30850: stdout chunk (state=3): >>><<< 11897 1726773101.30869: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11897 1726773101.30885: _low_level_execute_command(): starting 11897 1726773101.30891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299 `" && echo ansible-tmp-1726773101.3087716-11897-278474205314299="` echo /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299 `" ) && sleep 0' 11897 1726773101.33620: stdout chunk (state=2): >>>ansible-tmp-1726773101.3087716-11897-278474205314299=/root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299 <<< 11897 1726773101.33821: stderr chunk (state=3): >>><<< 11897 1726773101.33828: stdout chunk (state=3): >>><<< 11897 1726773101.33845: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773101.3087716-11897-278474205314299=/root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299 , stderr= 11897 1726773101.33931: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 11897 1726773101.33988: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299/AnsiballZ_kernel_settings_get_config.py 11897 1726773101.34316: Sending initial data 11897 1726773101.34331: Sent initial data (174 bytes) 11897 1726773101.36769: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpd3b859cd /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299/AnsiballZ_kernel_settings_get_config.py <<< 11897 1726773101.37743: stderr chunk (state=3): >>><<< 11897 1726773101.37749: stdout chunk (state=3): >>><<< 11897 1726773101.37771: done transferring module to remote 11897 1726773101.37787: _low_level_execute_command(): starting 11897 1726773101.37792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299/ /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11897 1726773101.40395: stderr chunk (state=2): >>><<< 11897 1726773101.40408: stdout chunk (state=2): >>><<< 11897 1726773101.40426: _low_level_execute_command() done: rc=0, stdout=, stderr= 11897 1726773101.40429: _low_level_execute_command(): starting 11897 1726773101.40435: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11897 1726773101.55929: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 11897 1726773101.56941: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11897 1726773101.56995: stderr chunk (state=3): >>><<< 11897 1726773101.57002: stdout chunk (state=3): >>><<< 11897 1726773101.57025: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 11897 1726773101.57053: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11897 1726773101.57064: _low_level_execute_command(): starting 11897 1726773101.57069: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773101.3087716-11897-278474205314299/ > /dev/null 2>&1 && sleep 0' 11897 1726773101.59739: stderr chunk (state=2): >>><<< 11897 1726773101.59754: stdout chunk (state=2): >>><<< 11897 1726773101.59774: _low_level_execute_command() done: rc=0, stdout=, stderr= 11897 1726773101.59780: handler run complete 11897 1726773101.59810: attempt loop complete, returning result 11897 1726773101.59824: _execute() done 11897 1726773101.59826: dumping result to json 11897 1726773101.59829: done dumping result, returning 11897 1726773101.59842: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [12a3200b-1e9d-1dbd-cc52-000000000e60] 11897 1726773101.59855: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e60 11897 1726773101.59929: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e60 11897 1726773101.59968: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8119 1726773101.60108: no more pending results, returning what we have 8119 1726773101.60114: results queue empty 8119 1726773101.60116: checking for any_errors_fatal 8119 1726773101.60120: done checking for any_errors_fatal 8119 1726773101.60122: checking for max_fail_percentage 8119 1726773101.60125: done checking for max_fail_percentage 8119 1726773101.60127: checking to see if all hosts have failed and the running result is not ok 8119 1726773101.60129: done checking to see if all hosts have failed 8119 1726773101.60131: getting the remaining hosts for this loop 8119 1726773101.60133: done getting the remaining hosts for this loop 8119 1726773101.60141: building list of next tasks for hosts 8119 1726773101.60143: getting the next task for host managed_node2 
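Given the registered result above, a downstream task could read individual settings from the data mapping. A purely illustrative usage example, reusing the assumed register name from the sketch further up rather than the role's actual variable:

    - name: Show one value from the tuned main config (illustration only)
      debug:
        msg: "dynamic_tuning is {{ __kernel_settings_tuned_main_conf.data.dynamic_tuning }}"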
8119 1726773101.60151: done getting next task for host managed_node2 8119 1726773101.60155: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773101.60160: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773101.60162: done building task lists 8119 1726773101.60164: counting tasks in each state of execution 8119 1726773101.60168: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773101.60170: advancing hosts in ITERATING_TASKS 8119 1726773101.60172: starting to advance hosts 8119 1726773101.60174: getting the next task for host managed_node2 8119 1726773101.60179: done getting next task for host managed_node2 8119 1726773101.60182: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8119 1726773101.60187: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773101.60190: done advancing hosts to next task 8119 1726773101.60205: getting variables 8119 1726773101.60208: in VariableManager get_vars() 8119 1726773101.60243: Calling all_inventory to load vars for managed_node2 8119 1726773101.60249: Calling groups_inventory to load vars for managed_node2 8119 1726773101.60252: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773101.60277: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.60291: Calling all_plugins_play to load vars for managed_node2 8119 1726773101.60302: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.60312: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773101.60324: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.60330: Calling groups_plugins_play to load vars for managed_node2 8119 1726773101.60339: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.60356: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.60369: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773101.60572: done with get_vars() 8119 1726773101.60584: done getting variables 8119 1726773101.60590: sending task start callback, copying the task so we can template it temporarily 8119 1726773101.60592: done copying, going to template now 8119 1726773101.60594: done templating 8119 1726773101.60595: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:41 -0400 (0:00:00.354) 0:01:36.162 **** 8119 1726773101.60612: sending task start callback 8119 1726773101.60614: entering _queue_task() for managed_node2/stat 8119 1726773101.60744: worker is 1 (out of 1 available) 8119 1726773101.60785: exiting _queue_task() for managed_node2/stat 8119 1726773101.60855: done queuing things up, now waiting for results queue to drain 8119 1726773101.60860: waiting for pending results... 
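The task queued above stats candidate parent directories for the tuned profile: in the run that follows, the loop's first (empty) item is skipped by a conditional, /etc/tuned/profiles turns out not to exist, and /etc/tuned does. A hedged sketch of such a looped stat task, where the two concrete paths and the skipped empty item come from the log while the loop source, guard, and register name are assumptions:

    - name: Find tuned profile parent directory
      stat:
        path: "{{ item }}"
      loop:
        - "{{ kernel_settings_custom_profiles_dir | d('') }}"   # assumed source of the empty first item
        - /etc/tuned/profiles
        - /etc/tuned
      when: item | length > 0                    # assumed guard; the log only shows item="" being skipped
      register: __kernel_settings_find_profile_dirs   # assumed register name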
11906 1726773101.60930: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 11906 1726773101.60990: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e61 11906 1726773101.62699: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11906 1726773101.62786: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11906 1726773101.62836: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11906 1726773101.62865: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11906 1726773101.62895: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11906 1726773101.62926: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11906 1726773101.62970: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11906 1726773101.62996: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11906 1726773101.63017: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11906 1726773101.63110: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11906 1726773101.63129: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11906 1726773101.63143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11906 1726773101.63493: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11906 1726773101.63497: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11906 1726773101.63500: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11906 1726773101.63502: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11906 1726773101.63503: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11906 1726773101.63505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.63507: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11906 1726773101.63511: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11906 1726773101.63513: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11906 1726773101.63530: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11906 
1726773101.63534: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11906 1726773101.63536: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.63805: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11906 1726773101.63812: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11906 1726773101.63814: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11906 1726773101.63816: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11906 1726773101.63818: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11906 1726773101.63820: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.63821: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11906 1726773101.63824: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11906 1726773101.63825: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11906 1726773101.63843: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11906 1726773101.63846: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11906 1726773101.63848: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.64025: when evaluation is False, skipping this task 11906 1726773101.64059: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11906 1726773101.64063: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11906 1726773101.64065: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11906 1726773101.64067: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11906 1726773101.64068: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11906 1726773101.64070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.64072: Loading 
FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11906 1726773101.64073: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11906 1726773101.64075: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11906 1726773101.64094: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11906 1726773101.64099: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11906 1726773101.64102: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.64297: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11906 1726773101.64301: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11906 1726773101.64303: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11906 1726773101.64305: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11906 1726773101.64307: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11906 1726773101.64311: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.64314: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11906 1726773101.64316: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11906 1726773101.64318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11906 1726773101.64345: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11906 1726773101.64350: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11906 1726773101.64353: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "item": "", "skip_reason": "Conditional result was False" } 11906 1726773101.64625: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11906 1726773101.64661: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11906 1726773101.64672: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 
11906 1726773101.64682: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11906 1726773101.64689: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11906 1726773101.64772: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11906 1726773101.64788: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11906 1726773101.64815: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11906 1726773101.64831: starting attempt loop 11906 1726773101.64833: running the handler 11906 1726773101.64841: _low_level_execute_command(): starting 11906 1726773101.64844: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11906 1726773101.67428: stdout chunk (state=2): >>>/root <<< 11906 1726773101.67545: stderr chunk (state=3): >>><<< 11906 1726773101.67553: stdout chunk (state=3): >>><<< 11906 1726773101.67576: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11906 1726773101.67592: _low_level_execute_command(): starting 11906 1726773101.67599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241 `" && echo ansible-tmp-1726773101.675866-11906-169062181697241="` echo /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241 `" ) && sleep 0' 11906 1726773101.70444: stdout chunk (state=2): >>>ansible-tmp-1726773101.675866-11906-169062181697241=/root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241 <<< 11906 1726773101.70606: stderr chunk (state=3): >>><<< 11906 1726773101.70613: stdout chunk (state=3): >>><<< 11906 1726773101.70631: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773101.675866-11906-169062181697241=/root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241 , stderr= 11906 1726773101.70722: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11906 1726773101.70777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241/AnsiballZ_stat.py 11906 1726773101.71119: Sending initial data 11906 1726773101.71134: Sent initial data (151 bytes) 11906 1726773101.73576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp6t44j69g /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241/AnsiballZ_stat.py <<< 11906 1726773101.74570: stderr chunk (state=3): >>><<< 11906 1726773101.74577: stdout chunk (state=3): >>><<< 11906 1726773101.74602: done transferring module to remote 11906 1726773101.74617: _low_level_execute_command(): starting 11906 1726773101.74621: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241/ /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241/AnsiballZ_stat.py && sleep 0' 11906 1726773101.77226: stderr chunk (state=2): >>><<< 11906 1726773101.77237: stdout chunk (state=2): >>><<< 11906 1726773101.77254: 
_low_level_execute_command() done: rc=0, stdout=, stderr= 11906 1726773101.77258: _low_level_execute_command(): starting 11906 1726773101.77264: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241/AnsiballZ_stat.py && sleep 0' 11906 1726773101.92490: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11906 1726773101.93459: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11906 1726773101.93515: stderr chunk (state=3): >>><<< 11906 1726773101.93521: stdout chunk (state=3): >>><<< 11906 1726773101.93541: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 11906 1726773101.93566: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11906 1726773101.93577: _low_level_execute_command(): starting 11906 1726773101.93582: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773101.675866-11906-169062181697241/ > /dev/null 2>&1 && sleep 0' 11906 1726773101.96225: stderr chunk (state=2): >>><<< 11906 1726773101.96237: stdout chunk (state=2): >>><<< 11906 1726773101.96255: _low_level_execute_command() done: rc=0, stdout=, stderr= 11906 1726773101.96262: handler run complete 11906 1726773101.96290: attempt loop complete, returning result 11906 1726773101.96625: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11906 1726773101.96633: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11906 1726773101.96637: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11906 1726773101.96641: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11906 1726773101.96644: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11906 1726773101.96648: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.96651: Loading FilterModule 'network' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11906 1726773101.96654: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11906 1726773101.96657: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 11906 1726773101.96696: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11906 1726773101.96701: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11906 1726773101.96703: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11906 1726773101.96991: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11906 1726773101.96998: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11906 1726773101.97002: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11906 1726773101.97079: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11906 1726773101.97095: plugin lookup for stat failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11906 1726773101.97101: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11906 1726773101.97107: starting attempt loop 11906 1726773101.97111: running the handler 11906 1726773101.97117: _low_level_execute_command(): starting 11906 1726773101.97120: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11906 1726773101.99458: stdout chunk (state=2): >>>/root <<< 11906 1726773101.99628: stderr chunk (state=3): >>><<< 11906 1726773101.99633: stdout chunk (state=3): >>><<< 11906 1726773101.99651: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11906 1726773101.99663: _low_level_execute_command(): starting 11906 1726773101.99669: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438 `" && echo ansible-tmp-1726773101.996581-11906-40788688163438="` echo /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438 `" ) && sleep 0' 11906 1726773102.02457: stdout chunk (state=2): >>>ansible-tmp-1726773101.996581-11906-40788688163438=/root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438 <<< 11906 1726773102.02576: stderr chunk (state=3): >>><<< 11906 1726773102.02581: stdout chunk (state=3): >>><<< 11906 1726773102.02602: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773101.996581-11906-40788688163438=/root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438 , stderr= 11906 1726773102.02678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11906 1726773102.02729: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438/AnsiballZ_stat.py 11906 1726773102.03008: Sending initial data 11906 1726773102.03023: Sent initial data (150 bytes) 11906 1726773102.05426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpmx95ix80 /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438/AnsiballZ_stat.py <<< 11906 1726773102.06400: stderr chunk (state=3): >>><<< 11906 1726773102.06404: stdout chunk (state=3): >>><<< 11906 1726773102.06427: done transferring module to remote 11906 1726773102.06438: _low_level_execute_command(): starting 11906 1726773102.06443: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438/ /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438/AnsiballZ_stat.py && sleep 0' 11906 1726773102.08946: stderr chunk (state=2): >>><<< 11906 1726773102.08957: stdout chunk (state=2): >>><<< 11906 1726773102.08975: _low_level_execute_command() done: rc=0, stdout=, stderr= 11906 1726773102.08978: _low_level_execute_command(): starting 11906 1726773102.08986: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438/AnsiballZ_stat.py && sleep 0' 11906 1726773102.24061: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11906 1726773102.25094: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11906 1726773102.25143: stderr chunk (state=3): >>><<< 11906 1726773102.25148: stdout chunk (state=3): >>><<< 11906 1726773102.25169: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773035.2883239, "mtime": 1726773033.0853279, "ctime": 1726773033.0853279, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_md5": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.150 closed. 11906 1726773102.25228: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11906 1726773102.25240: _low_level_execute_command(): starting 11906 1726773102.25245: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773101.996581-11906-40788688163438/ > /dev/null 2>&1 && sleep 0' 11906 1726773102.27957: stderr chunk (state=2): >>><<< 11906 1726773102.27972: stdout chunk (state=2): >>><<< 11906 1726773102.27995: _low_level_execute_command() done: rc=0, stdout=, stderr= 11906 1726773102.28001: handler run complete 11906 1726773102.28044: attempt loop complete, returning result 11906 1726773102.28210: dumping result to json 11906 1726773102.28271: done dumping result, returning 11906 1726773102.28288: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [12a3200b-1e9d-1dbd-cc52-000000000e61] 11906 1726773102.28296: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e61 11906 1726773102.28300: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e61 11906 1726773102.28302: WORKER PROCESS EXITING ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773035.2883239, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773033.0853279, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 
1726773033.0853279, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8119 1726773102.28686: no more pending results, returning what we have 8119 1726773102.28692: results queue empty 8119 1726773102.28694: checking for any_errors_fatal 8119 1726773102.28698: done checking for any_errors_fatal 8119 1726773102.28699: checking for max_fail_percentage 8119 1726773102.28702: done checking for max_fail_percentage 8119 1726773102.28703: checking to see if all hosts have failed and the running result is not ok 8119 1726773102.28704: done checking to see if all hosts have failed 8119 1726773102.28706: getting the remaining hosts for this loop 8119 1726773102.28709: done getting the remaining hosts for this loop 8119 1726773102.28715: building list of next tasks for hosts 8119 1726773102.28717: getting the next task for host managed_node2 8119 1726773102.28723: done getting next task for host managed_node2 8119 1726773102.28726: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773102.28729: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773102.28731: done building task lists 8119 1726773102.28732: counting tasks in each state of execution 8119 1726773102.28735: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773102.28737: advancing hosts in ITERATING_TASKS 8119 1726773102.28738: starting to advance hosts 8119 1726773102.28740: getting the next task for host managed_node2 8119 1726773102.28743: done getting next task for host managed_node2 8119 1726773102.28745: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8119 1726773102.28747: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773102.28748: done advancing hosts to next task 8119 1726773102.28760: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773102.28763: getting variables 8119 1726773102.28765: in VariableManager get_vars() 8119 1726773102.28794: Calling all_inventory to load vars for managed_node2 8119 1726773102.28798: Calling groups_inventory to load vars for managed_node2 8119 1726773102.28800: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773102.28825: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.28836: Calling all_plugins_play to load vars for managed_node2 8119 1726773102.28846: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.28855: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773102.28865: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.28871: Calling groups_plugins_play to load vars for managed_node2 8119 1726773102.28880: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.28901: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.28919: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.29124: done with get_vars() 8119 1726773102.29134: done getting variables 8119 1726773102.29139: sending task start callback, copying the task so we can template it temporarily 8119 1726773102.29140: done copying, going to template now 8119 1726773102.29142: done templating 8119 1726773102.29144: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.685) 0:01:36.848 **** 8119 1726773102.29159: sending task start callback 8119 1726773102.29161: entering _queue_task() for managed_node2/set_fact 8119 1726773102.29288: worker is 1 (out of 1 available) 8119 1726773102.29329: exiting _queue_task() for managed_node2/set_fact 8119 1726773102.29402: done queuing things up, now waiting for results queue to drain 8119 1726773102.29411: waiting for pending results... 
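
The trace above corresponds to the role's "Find tuned profile parent directory" task: a stat check looped over candidate tuned configuration directories, where /etc/tuned/profiles came back with exists=false and /etc/tuned with exists=true. A minimal sketch of such a looped stat task, reconstructed only from the module arguments and loop items visible in the log (the register name is a guess; the actual task in tasks/main.yml may differ):

# Hypothetical reconstruction based on the stat invocations logged above,
# not the verbatim task from the kernel_settings role.
- name: Find tuned profile parent directory
  stat:
    path: "{{ item }}"
    follow: false
    get_checksum: true
    get_mime: true
    get_attributes: true
  register: __kernel_settings_find_profile_dirs   # hypothetical register name
  loop:
    - /etc/tuned/profiles   # logged result: exists=false
    - /etc/tuned            # logged result: exists=true
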
11925 1726773102.29465: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 11925 1726773102.29522: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e62 11925 1726773102.29577: calling self._execute() 11925 1726773102.31306: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11925 1726773102.31388: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11925 1726773102.31454: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11925 1726773102.31481: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11925 1726773102.31515: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11925 1726773102.31546: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11925 1726773102.31592: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11925 1726773102.31618: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11925 1726773102.31635: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11925 1726773102.31717: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11925 1726773102.31733: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11925 1726773102.31747: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11925 1726773102.32154: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11925 1726773102.32189: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11925 1726773102.32199: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11925 1726773102.32212: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11925 1726773102.32217: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11925 1726773102.32311: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11925 1726773102.32331: starting attempt loop 11925 1726773102.32333: running the handler 11925 1726773102.32347: handler run complete 11925 1726773102.32351: attempt loop complete, returning result 11925 1726773102.32354: _execute() done 11925 1726773102.32355: dumping result to json 11925 1726773102.32357: done dumping result, returning 11925 1726773102.32361: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [12a3200b-1e9d-1dbd-cc52-000000000e62] 11925 1726773102.32368: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e62 11925 1726773102.32395: done sending task result for task 
12a3200b-1e9d-1dbd-cc52-000000000e62 11925 1726773102.32440: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8119 1726773102.32565: no more pending results, returning what we have 8119 1726773102.32569: results queue empty 8119 1726773102.32571: checking for any_errors_fatal 8119 1726773102.32577: done checking for any_errors_fatal 8119 1726773102.32579: checking for max_fail_percentage 8119 1726773102.32581: done checking for max_fail_percentage 8119 1726773102.32586: checking to see if all hosts have failed and the running result is not ok 8119 1726773102.32588: done checking to see if all hosts have failed 8119 1726773102.32589: getting the remaining hosts for this loop 8119 1726773102.32592: done getting the remaining hosts for this loop 8119 1726773102.32599: building list of next tasks for hosts 8119 1726773102.32602: getting the next task for host managed_node2 8119 1726773102.32610: done getting next task for host managed_node2 8119 1726773102.32614: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773102.32618: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773102.32621: done building task lists 8119 1726773102.32623: counting tasks in each state of execution 8119 1726773102.32627: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773102.32629: advancing hosts in ITERATING_TASKS 8119 1726773102.32631: starting to advance hosts 8119 1726773102.32633: getting the next task for host managed_node2 8119 1726773102.32637: done getting next task for host managed_node2 8119 1726773102.32640: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8119 1726773102.32643: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773102.32645: done advancing hosts to next task 8119 1726773102.32660: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773102.32665: getting variables 8119 1726773102.32667: in VariableManager get_vars() 8119 1726773102.32705: Calling all_inventory to load vars for managed_node2 8119 1726773102.32712: Calling groups_inventory to load vars for managed_node2 8119 1726773102.32717: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773102.32744: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.32756: Calling all_plugins_play to load vars for managed_node2 8119 1726773102.32768: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.32777: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773102.32790: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.32797: Calling groups_plugins_play to load vars for managed_node2 8119 1726773102.32807: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.32826: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.32840: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.33042: done with get_vars() 8119 1726773102.33052: done getting variables 8119 1726773102.33057: sending task start callback, copying the task so we can template it temporarily 8119 1726773102.33059: done copying, going to template now 8119 1726773102.33061: done templating 8119 1726773102.33062: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.039) 0:01:36.887 **** 8119 1726773102.33078: sending task start callback 8119 1726773102.33079: entering _queue_task() for managed_node2/service 8119 1726773102.33205: worker is 1 (out of 1 available) 8119 1726773102.33242: exiting _queue_task() for managed_node2/service 8119 1726773102.33311: done queuing things up, now waiting for results queue to drain 8119 1726773102.33317: waiting for pending results... 
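
The "Set tuned profile parent dir" task above runs entirely on the controller (a set_fact, no remote command), and ends with __kernel_settings_profile_parent set to /etc/tuned. A hedged sketch of what such a task could look like, assuming the fact is derived from the preceding stat loop; the variable names and the Jinja2 expression are illustrative, not the role's actual code at tasks/main.yml:63:

# Illustrative only: picks the first candidate directory whose stat reported exists=true.
# __kernel_settings_find_profile_dirs is the hypothetical register from the sketch above.
- name: Set tuned profile parent dir
  set_fact:
    __kernel_settings_profile_parent: "{{ __kernel_settings_find_profile_dirs.results
      | selectattr('stat.exists') | map(attribute='item') | list | first }}"
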
11927 1726773102.33381: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 11927 1726773102.33438: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e63 11927 1726773102.35076: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11927 1726773102.35180: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11927 1726773102.35234: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11927 1726773102.35260: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11927 1726773102.35293: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11927 1726773102.35325: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11927 1726773102.35366: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11927 1726773102.35395: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11927 1726773102.35424: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11927 1726773102.35500: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11927 1726773102.35521: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11927 1726773102.35540: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11927 1726773102.35706: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11927 1726773102.35712: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11927 1726773102.35715: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11927 1726773102.35717: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11927 1726773102.35718: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11927 1726773102.35720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11927 1726773102.35722: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11927 1726773102.35724: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11927 1726773102.35725: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11927 1726773102.35739: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 
11927 1726773102.35742: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11927 1726773102.35744: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11927 1726773102.35907: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11927 1726773102.35914: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11927 1726773102.35917: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11927 1726773102.35919: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11927 1726773102.35921: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11927 1726773102.35922: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11927 1726773102.35924: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11927 1726773102.35926: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11927 1726773102.35928: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11927 1726773102.35946: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11927 1726773102.35948: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11927 1726773102.35950: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11927 1726773102.36044: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11927 1726773102.36076: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11927 1726773102.36088: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11927 1726773102.36101: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11927 1726773102.36109: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11927 1726773102.36199: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11927 1726773102.36220: starting attempt loop 11927 1726773102.36223: running the handler 11927 1726773102.36346: _low_level_execute_command(): 
starting 11927 1726773102.36352: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11927 1726773102.38851: stdout chunk (state=2): >>>/root <<< 11927 1726773102.38964: stderr chunk (state=3): >>><<< 11927 1726773102.38969: stdout chunk (state=3): >>><<< 11927 1726773102.38993: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11927 1726773102.39006: _low_level_execute_command(): starting 11927 1726773102.39014: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083 `" && echo ansible-tmp-1726773102.3900099-11927-102109977186083="` echo /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083 `" ) && sleep 0' 11927 1726773102.41837: stdout chunk (state=2): >>>ansible-tmp-1726773102.3900099-11927-102109977186083=/root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083 <<< 11927 1726773102.41967: stderr chunk (state=3): >>><<< 11927 1726773102.41973: stdout chunk (state=3): >>><<< 11927 1726773102.41994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773102.3900099-11927-102109977186083=/root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083 , stderr= 11927 1726773102.42116: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/systemd-ZIP_DEFLATED 11927 1726773102.42214: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083/AnsiballZ_systemd.py 11927 1726773102.42563: Sending initial data 11927 1726773102.42577: Sent initial data (155 bytes) 11927 1726773102.45035: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp4d0l0lca /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083/AnsiballZ_systemd.py <<< 11927 1726773102.46819: stderr chunk (state=3): >>><<< 11927 1726773102.46827: stdout chunk (state=3): >>><<< 11927 1726773102.46852: done transferring module to remote 11927 1726773102.46865: _low_level_execute_command(): starting 11927 1726773102.46870: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083/ /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083/AnsiballZ_systemd.py && sleep 0' 11927 1726773102.49476: stderr chunk (state=2): >>><<< 11927 1726773102.49492: stdout chunk (state=2): >>><<< 11927 1726773102.49514: _low_level_execute_command() done: rc=0, stdout=, stderr= 11927 1726773102.49518: _low_level_execute_command(): starting 11927 1726773102.49525: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083/AnsiballZ_systemd.py && sleep 0' 11927 1726773102.74878: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", 
"Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18944000", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11927 1726773102.74905: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot 
cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 11927 1726773102.76392: stderr chunk (state=3): >>>Shared connection to 
10.31.8.150 closed. <<< 11927 1726773102.76439: stderr chunk (state=3): >>><<< 11927 1726773102.76444: stdout chunk (state=3): >>><<< 11927 1726773102.76463: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18944000", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": 
"yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 11927 1726773102.76585: done with _execute_module (systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11927 1726773102.76603: _low_level_execute_command(): starting 11927 1726773102.76613: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773102.3900099-11927-102109977186083/ > /dev/null 2>&1 && sleep 0' 11927 1726773102.79232: stderr chunk (state=2): >>><<< 11927 1726773102.79242: stdout chunk (state=2): >>><<< 11927 1726773102.79259: _low_level_execute_command() done: rc=0, stdout=, stderr= 11927 1726773102.79268: handler run complete 11927 1726773102.79275: attempt loop complete, returning result 11927 1726773102.79343: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11927 1726773102.79348: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 11927 1726773102.79351: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 11927 1726773102.79353: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 11927 1726773102.79356: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 11927 1726773102.79358: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11927 1726773102.79360: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 11927 1726773102.79362: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11927 1726773102.79364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 11927 1726773102.79399: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11927 1726773102.79403: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11927 1726773102.79405: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11927 1726773102.79558: dumping result to json 11927 1726773102.79676: done dumping result, returning 11927 1726773102.79695: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [12a3200b-1e9d-1dbd-cc52-000000000e63] 11927 1726773102.79706: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e63 11927 1726773102.79711: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e63 11927 1726773102.79713: WORKER PROCESS EXITING ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": 
"no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "658", "MemoryAccounting": "yes", "MemoryCurrent": "18944000", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": 
"success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "WatchdogUSec": "0" } } 8119 1726773102.80160: no more pending results, returning what we have 8119 1726773102.80165: results queue empty 8119 1726773102.80167: checking for any_errors_fatal 8119 1726773102.80170: done checking for any_errors_fatal 8119 1726773102.80171: checking for max_fail_percentage 8119 1726773102.80173: done checking for max_fail_percentage 8119 1726773102.80175: checking to see if all hosts have failed and the running result is not ok 8119 1726773102.80176: done checking to see if all hosts have failed 8119 1726773102.80177: getting the remaining hosts for this loop 8119 1726773102.80179: done getting the remaining hosts for this loop 8119 1726773102.80185: building list of next tasks for hosts 8119 1726773102.80188: getting the next task for host managed_node2 8119 1726773102.80193: done getting next task for host managed_node2 8119 1726773102.80196: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773102.80199: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773102.80201: done building task lists 8119 1726773102.80202: counting tasks in each state of execution 8119 1726773102.80205: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773102.80207: advancing hosts in ITERATING_TASKS 8119 1726773102.80210: starting to advance hosts 8119 1726773102.80212: getting the next task for host managed_node2 8119 1726773102.80215: done getting next task for host managed_node2 8119 1726773102.80217: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8119 1726773102.80219: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773102.80220: done advancing hosts to next task 8119 1726773102.80231: getting variables 8119 1726773102.80234: in VariableManager get_vars() 8119 1726773102.80259: Calling all_inventory to load vars for managed_node2 8119 1726773102.80263: Calling groups_inventory to load vars for managed_node2 8119 1726773102.80265: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773102.80287: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.80299: Calling all_plugins_play to load vars for managed_node2 8119 1726773102.80312: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.80322: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773102.80333: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.80338: Calling groups_plugins_play to load vars for managed_node2 8119 1726773102.80347: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.80364: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.80377: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773102.80574: done with get_vars() 8119 1726773102.80586: done getting variables 8119 1726773102.80592: sending task start callback, copying the task so we can template it temporarily 8119 1726773102.80593: done copying, going to template now 8119 1726773102.80595: done templating 8119 1726773102.80597: here goes the callback... 
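Note: the JSON block that ends with the WatchdogUSec field above is the complete systemd unit-property map for tuned.service, reported back by whichever service-management step ran just before this point (that task itself appears earlier in the log). For orientation only, a hedged sketch of the kind of task whose result carries such a property map in its status field; the task name and register name here are assumptions, not taken from this output:

- name: Ensure required services are enabled and started   # assumed task name
  systemd:
    name: tuned
    state: started
    enabled: true
  register: __kernel_settings_tuned_unit   # assumed register name; its .status field holds a property map like the one above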
TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.475) 0:01:37.362 **** 8119 1726773102.80614: sending task start callback 8119 1726773102.80616: entering _queue_task() for managed_node2/file 8119 1726773102.80739: worker is 1 (out of 1 available) 8119 1726773102.80777: exiting _queue_task() for managed_node2/file 8119 1726773102.80851: done queuing things up, now waiting for results queue to drain 8119 1726773102.80856: waiting for pending results... 11936 1726773102.80917: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 11936 1726773102.80969: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e64 11936 1726773102.81017: calling self._execute() 11936 1726773102.82736: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11936 1726773102.82818: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11936 1726773102.82870: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11936 1726773102.82901: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11936 1726773102.82929: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11936 1726773102.82958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11936 1726773102.83003: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11936 1726773102.83027: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11936 1726773102.83044: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11936 1726773102.83179: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11936 1726773102.83201: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11936 1726773102.83219: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11936 1726773102.83448: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11936 1726773102.83480: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11936 1726773102.83492: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11936 1726773102.83503: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11936 1726773102.83509: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11936 1726773102.83593: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11936 1726773102.83607: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11936 1726773102.83631: Loading 
ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11936 1726773102.83646: starting attempt loop 11936 1726773102.83648: running the handler 11936 1726773102.83658: _low_level_execute_command(): starting 11936 1726773102.83663: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11936 1726773102.86135: stdout chunk (state=2): >>>/root <<< 11936 1726773102.86252: stderr chunk (state=3): >>><<< 11936 1726773102.86257: stdout chunk (state=3): >>><<< 11936 1726773102.86277: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11936 1726773102.86293: _low_level_execute_command(): starting 11936 1726773102.86299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282 `" && echo ansible-tmp-1726773102.862875-11936-165276627058282="` echo /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282 `" ) && sleep 0' 11936 1726773102.89302: stdout chunk (state=2): >>>ansible-tmp-1726773102.862875-11936-165276627058282=/root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282 <<< 11936 1726773102.89425: stderr chunk (state=3): >>><<< 11936 1726773102.89430: stdout chunk (state=3): >>><<< 11936 1726773102.89450: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773102.862875-11936-165276627058282=/root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282 , stderr= 11936 1726773102.89534: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11936 1726773102.89594: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282/AnsiballZ_file.py 11936 1726773102.89893: Sending initial data 11936 1726773102.89909: Sent initial data (151 bytes) 11936 1726773102.92326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpd0x9_eo4 /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282/AnsiballZ_file.py <<< 11936 1726773102.93345: stderr chunk (state=3): >>><<< 11936 1726773102.93352: stdout chunk (state=3): >>><<< 11936 1726773102.93375: done transferring module to remote 11936 1726773102.93390: _low_level_execute_command(): starting 11936 1726773102.93396: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282/ /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282/AnsiballZ_file.py && sleep 0' 11936 1726773102.95996: stderr chunk (state=2): >>><<< 11936 1726773102.96010: stdout chunk (state=2): >>><<< 11936 1726773102.96031: _low_level_execute_command() done: rc=0, stdout=, stderr= 11936 1726773102.96034: _low_level_execute_command(): starting 11936 1726773102.96041: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282/AnsiballZ_file.py && sleep 0' 11936 1726773103.11965: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", 
"secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11936 1726773103.13037: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11936 1726773103.13086: stderr chunk (state=3): >>><<< 11936 1726773103.13091: stdout chunk (state=3): >>><<< 11936 1726773103.13114: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
11936 1726773103.13150: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11936 1726773103.13162: _low_level_execute_command(): starting 11936 1726773103.13167: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773102.862875-11936-165276627058282/ > /dev/null 2>&1 && sleep 0' 11936 1726773103.15792: stderr chunk (state=2): >>><<< 11936 1726773103.15803: stdout chunk (state=2): >>><<< 11936 1726773103.15824: _low_level_execute_command() done: rc=0, stdout=, stderr= 11936 1726773103.15833: handler run complete 11936 1726773103.15840: attempt loop complete, returning result 11936 1726773103.15853: _execute() done 11936 1726773103.15855: dumping result to json 11936 1726773103.15859: done dumping result, returning 11936 1726773103.15872: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [12a3200b-1e9d-1dbd-cc52-000000000e64] 11936 1726773103.15888: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e64 11936 1726773103.15931: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e64 11936 1726773103.15935: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8119 1726773103.16194: no more pending results, returning what we have 8119 1726773103.16200: results queue empty 8119 1726773103.16203: checking for any_errors_fatal 8119 1726773103.16212: done checking for any_errors_fatal 8119 1726773103.16214: checking for max_fail_percentage 8119 1726773103.16217: done checking for max_fail_percentage 8119 1726773103.16219: checking to see if all hosts have failed and the running result is not ok 8119 1726773103.16221: done checking to see if all hosts have failed 8119 1726773103.16223: getting the remaining hosts for this loop 8119 1726773103.16226: done getting the remaining hosts for this loop 8119 1726773103.16233: building list of next tasks for hosts 8119 1726773103.16236: getting the next task for host managed_node2 8119 1726773103.16244: done getting next task for host managed_node2 8119 1726773103.16247: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773103.16250: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773103.16252: done building task lists 8119 1726773103.16253: counting tasks in each state of execution 8119 1726773103.16256: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773103.16258: advancing hosts in ITERATING_TASKS 8119 1726773103.16259: starting to advance hosts 8119 1726773103.16260: getting the next task for host managed_node2 8119 1726773103.16263: done getting next task for host managed_node2 8119 1726773103.16265: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8119 1726773103.16267: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773103.16269: done advancing hosts to next task 8119 1726773103.16280: getting variables 8119 1726773103.16285: in VariableManager get_vars() 8119 1726773103.16315: Calling all_inventory to load vars for managed_node2 8119 1726773103.16319: Calling groups_inventory to load vars for managed_node2 8119 1726773103.16321: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773103.16344: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.16354: Calling all_plugins_play to load vars for managed_node2 8119 1726773103.16364: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.16372: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773103.16384: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.16392: Calling groups_plugins_play to load vars for managed_node2 8119 1726773103.16402: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.16421: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.16435: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.16635: done with get_vars() 8119 
1726773103.16646: done getting variables 8119 1726773103.16650: sending task start callback, copying the task so we can template it temporarily 8119 1726773103.16652: done copying, going to template now 8119 1726773103.16654: done templating 8119 1726773103.16655: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.360) 0:01:37.723 **** 8119 1726773103.16671: sending task start callback 8119 1726773103.16673: entering _queue_task() for managed_node2/slurp 8119 1726773103.16795: worker is 1 (out of 1 available) 8119 1726773103.16834: exiting _queue_task() for managed_node2/slurp 8119 1726773103.16904: done queuing things up, now waiting for results queue to drain 8119 1726773103.16909: waiting for pending results... 11948 1726773103.16977: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 11948 1726773103.17035: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e65 11948 1726773103.17079: calling self._execute() 11948 1726773103.18781: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11948 1726773103.18867: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11948 1726773103.18921: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11948 1726773103.18948: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11948 1726773103.18973: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11948 1726773103.19015: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11948 1726773103.19058: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11948 1726773103.19080: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11948 1726773103.19100: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11948 1726773103.19239: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11948 1726773103.19258: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11948 1726773103.19272: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11948 1726773103.19488: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11948 1726773103.19524: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11948 1726773103.19534: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11948 1726773103.19544: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11948 1726773103.19550: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11948 1726773103.19633: plugin lookup for slurp failed; errors: No module named 
'ansible_collections.fedora.linux_system_roles.plugins.action' 11948 1726773103.19647: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11948 1726773103.19669: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11948 1726773103.19686: starting attempt loop 11948 1726773103.19690: running the handler 11948 1726773103.19699: _low_level_execute_command(): starting 11948 1726773103.19703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11948 1726773103.22133: stdout chunk (state=2): >>>/root <<< 11948 1726773103.22254: stderr chunk (state=3): >>><<< 11948 1726773103.22258: stdout chunk (state=3): >>><<< 11948 1726773103.22281: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11948 1726773103.22298: _low_level_execute_command(): starting 11948 1726773103.22304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200 `" && echo ansible-tmp-1726773103.222927-11948-72809627307200="` echo /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200 `" ) && sleep 0' 11948 1726773103.25267: stdout chunk (state=2): >>>ansible-tmp-1726773103.222927-11948-72809627307200=/root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200 <<< 11948 1726773103.25398: stderr chunk (state=3): >>><<< 11948 1726773103.25403: stdout chunk (state=3): >>><<< 11948 1726773103.25422: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773103.222927-11948-72809627307200=/root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200 , stderr= 11948 1726773103.25500: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/slurp-ZIP_DEFLATED 11948 1726773103.25556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200/AnsiballZ_slurp.py 11948 1726773103.25852: Sending initial data 11948 1726773103.25866: Sent initial data (151 bytes) 11948 1726773103.28300: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp1uydtq8o /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200/AnsiballZ_slurp.py <<< 11948 1726773103.29264: stderr chunk (state=3): >>><<< 11948 1726773103.29269: stdout chunk (state=3): >>><<< 11948 1726773103.29293: done transferring module to remote 11948 1726773103.29308: _low_level_execute_command(): starting 11948 1726773103.29313: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200/ /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200/AnsiballZ_slurp.py && sleep 0' 11948 1726773103.31830: stderr chunk (state=2): >>><<< 11948 1726773103.31840: stdout chunk (state=2): >>><<< 11948 1726773103.31858: _low_level_execute_command() done: rc=0, stdout=, stderr= 11948 1726773103.31863: _low_level_execute_command(): starting 11948 1726773103.31870: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200/AnsiballZ_slurp.py && sleep 0' 11948 1726773103.46167: stdout chunk (state=2): >>> {"content": 
"dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11948 1726773103.47107: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11948 1726773103.47155: stderr chunk (state=3): >>><<< 11948 1726773103.47164: stdout chunk (state=3): >>><<< 11948 1726773103.47188: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.150 closed. 11948 1726773103.47215: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11948 1726773103.47227: _low_level_execute_command(): starting 11948 1726773103.47232: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773103.222927-11948-72809627307200/ > /dev/null 2>&1 && sleep 0' 11948 1726773103.49880: stderr chunk (state=2): >>><<< 11948 1726773103.49892: stdout chunk (state=2): >>><<< 11948 1726773103.49911: _low_level_execute_command() done: rc=0, stdout=, stderr= 11948 1726773103.49918: handler run complete 11948 1726773103.49946: attempt loop complete, returning result 11948 1726773103.49960: _execute() done 11948 1726773103.49962: dumping result to json 11948 1726773103.49964: done dumping result, returning 11948 1726773103.49977: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [12a3200b-1e9d-1dbd-cc52-000000000e65] 11948 1726773103.49991: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e65 11948 1726773103.50032: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e65 11948 1726773103.50035: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8119 1726773103.50264: no more pending results, returning what we have 8119 1726773103.50269: results queue empty 8119 1726773103.50271: checking for any_errors_fatal 8119 1726773103.50277: done checking for any_errors_fatal 8119 1726773103.50279: checking for max_fail_percentage 8119 1726773103.50282: done checking for max_fail_percentage 8119 1726773103.50286: checking to see if all hosts have failed and the running result is not ok 8119 1726773103.50289: done checking to see if all hosts have failed 8119 1726773103.50291: getting the remaining hosts for this loop 8119 1726773103.50293: done getting the remaining hosts for this loop 8119 1726773103.50301: building list of next tasks for hosts 8119 1726773103.50304: getting the next task for host managed_node2 8119 1726773103.50311: done getting next task for host 
managed_node2 8119 1726773103.50317: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773103.50322: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773103.50324: done building task lists 8119 1726773103.50326: counting tasks in each state of execution 8119 1726773103.50328: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773103.50330: advancing hosts in ITERATING_TASKS 8119 1726773103.50332: starting to advance hosts 8119 1726773103.50333: getting the next task for host managed_node2 8119 1726773103.50336: done getting next task for host managed_node2 8119 1726773103.50338: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8119 1726773103.50340: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773103.50342: done advancing hosts to next task 8119 1726773103.50354: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773103.50356: getting variables 8119 1726773103.50358: in VariableManager get_vars() 8119 1726773103.50388: Calling all_inventory to load vars for managed_node2 8119 1726773103.50392: Calling groups_inventory to load vars for managed_node2 8119 1726773103.50395: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773103.50418: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.50429: Calling all_plugins_play to load vars for managed_node2 8119 1726773103.50439: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.50448: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773103.50458: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.50464: Calling groups_plugins_play to load vars for managed_node2 8119 1726773103.50473: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.50494: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.50510: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.50710: done with get_vars() 8119 1726773103.50721: done getting variables 8119 1726773103.50726: sending task start callback, copying the task so we can template it temporarily 8119 1726773103.50728: done copying, going to template now 8119 1726773103.50729: done templating 8119 1726773103.50731: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.340) 0:01:38.063 **** 8119 1726773103.50746: sending task start callback 8119 1726773103.50748: entering _queue_task() for managed_node2/set_fact 8119 1726773103.50866: worker is 1 (out of 1 available) 8119 1726773103.50906: exiting _queue_task() for managed_node2/set_fact 8119 1726773103.50976: done queuing things up, now waiting for results queue to drain 8119 1726773103.50981: waiting for pending results... 
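Note: the "Get active_profile" result above is a slurp of /etc/tuned/active_profile; its base64 payload dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK decodes to "virtual-guest kernel_settings" plus a trailing newline (30 bytes, matching the stat size reported later in this run). A sketch of the slurp task at tasks/main.yml:80, with an assumed register name:

- name: Get active_profile
  slurp:
    path: /etc/tuned/active_profile   # tuned's record of the currently active profile(s)
  register: __kernel_settings_tuned_profile   # assumed register name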
11957 1726773103.51048: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 11957 1726773103.51102: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e66 11957 1726773103.51150: calling self._execute() 11957 1726773103.52850: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11957 1726773103.52930: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11957 1726773103.52982: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11957 1726773103.53012: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11957 1726773103.53038: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11957 1726773103.53066: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11957 1726773103.53111: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11957 1726773103.53135: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11957 1726773103.53152: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11957 1726773103.53245: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11957 1726773103.53262: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11957 1726773103.53276: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11957 1726773103.53591: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11957 1726773103.53625: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11957 1726773103.53635: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11957 1726773103.53646: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11957 1726773103.53652: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11957 1726773103.53744: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11957 1726773103.53761: starting attempt loop 11957 1726773103.53763: running the handler 11957 1726773103.53777: handler run complete 11957 1726773103.53781: attempt loop complete, returning result 11957 1726773103.53784: _execute() done 11957 1726773103.53787: dumping result to json 11957 1726773103.53789: done dumping result, returning 11957 1726773103.53793: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [12a3200b-1e9d-1dbd-cc52-000000000e66] 11957 1726773103.53799: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e66 11957 1726773103.53826: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e66 
11957 1726773103.53830: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8119 1726773103.53975: no more pending results, returning what we have 8119 1726773103.53979: results queue empty 8119 1726773103.53981: checking for any_errors_fatal 8119 1726773103.53988: done checking for any_errors_fatal 8119 1726773103.53990: checking for max_fail_percentage 8119 1726773103.53993: done checking for max_fail_percentage 8119 1726773103.53995: checking to see if all hosts have failed and the running result is not ok 8119 1726773103.53997: done checking to see if all hosts have failed 8119 1726773103.53999: getting the remaining hosts for this loop 8119 1726773103.54001: done getting the remaining hosts for this loop 8119 1726773103.54008: building list of next tasks for hosts 8119 1726773103.54011: getting the next task for host managed_node2 8119 1726773103.54019: done getting next task for host managed_node2 8119 1726773103.54023: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773103.54028: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773103.54030: done building task lists 8119 1726773103.54032: counting tasks in each state of execution 8119 1726773103.54036: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773103.54038: advancing hosts in ITERATING_TASKS 8119 1726773103.54040: starting to advance hosts 8119 1726773103.54043: getting the next task for host managed_node2 8119 1726773103.54047: done getting next task for host managed_node2 8119 1726773103.54050: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8119 1726773103.54053: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773103.54056: done advancing hosts to next task 8119 1726773103.54070: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773103.54074: getting variables 8119 1726773103.54076: in VariableManager get_vars() 8119 1726773103.54111: Calling all_inventory to load vars for managed_node2 8119 1726773103.54115: Calling groups_inventory to load vars for managed_node2 8119 1726773103.54118: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773103.54137: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.54147: Calling all_plugins_play to load vars for managed_node2 8119 1726773103.54157: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.54165: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773103.54174: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.54180: Calling groups_plugins_play to load vars for managed_node2 8119 1726773103.54192: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.54212: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.54226: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773103.54424: done with get_vars() 8119 1726773103.54434: done getting variables 8119 1726773103.54439: sending task start callback, copying the task so we can template it temporarily 8119 1726773103.54441: done copying, going to template now 8119 1726773103.54443: done templating 8119 1726773103.54444: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.037) 0:01:38.101 **** 8119 1726773103.54459: sending task start callback 8119 1726773103.54461: entering _queue_task() for managed_node2/copy 8119 1726773103.54572: worker is 1 (out of 1 available) 8119 1726773103.54609: exiting _queue_task() for managed_node2/copy 8119 1726773103.54676: done queuing things up, now waiting for results queue to drain 8119 1726773103.54681: waiting for pending results... 
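Note: the "Set active_profile" task above runs entirely on the controller (set_fact, hence no _low_level_execute_command calls) and stores the decoded profile list as __kernel_settings_active_profile = "virtual-guest kernel_settings". A sketch of what the task at tasks/main.yml:85 plausibly does; the log only records the final value, so the exact Jinja2 expression, including how the role would append kernel_settings if it were missing, is an assumption:

- name: Set active_profile
  set_fact:
    # assumed expression; in this run the slurped content already ends in "kernel_settings",
    # so decoding and trimming it reproduces the value shown in the task result
    __kernel_settings_active_profile: "{{ __kernel_settings_tuned_profile.content | b64decode | trim }}"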
11959 1726773103.54748: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 11959 1726773103.54801: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e67 11959 1726773103.54847: calling self._execute() 11959 1726773103.56537: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11959 1726773103.56618: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11959 1726773103.56670: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11959 1726773103.56698: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11959 1726773103.56798: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11959 1726773103.56830: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11959 1726773103.56870: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11959 1726773103.56895: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11959 1726773103.56920: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11959 1726773103.56992: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11959 1726773103.57011: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11959 1726773103.57030: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11959 1726773103.57298: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11959 1726773103.57332: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11959 1726773103.57342: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11959 1726773103.57352: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11959 1726773103.57357: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11959 1726773103.57448: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11959 1726773103.57456: starting attempt loop 11959 1726773103.57458: running the handler 11959 1726773103.57466: _low_level_execute_command(): starting 11959 1726773103.57470: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11959 1726773103.59972: stdout chunk (state=2): >>>/root <<< 11959 1726773103.60090: stderr chunk (state=3): >>><<< 11959 1726773103.60095: stdout chunk (state=3): >>><<< 11959 1726773103.60118: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11959 1726773103.60131: _low_level_execute_command(): starting 11959 1726773103.60137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195 `" && echo ansible-tmp-1726773103.6012619-11959-246390025203195="` echo /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195 `" ) && sleep 0' 11959 1726773103.63312: stdout chunk (state=2): >>>ansible-tmp-1726773103.6012619-11959-246390025203195=/root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195 <<< 11959 1726773103.63436: stderr chunk (state=3): >>><<< 11959 1726773103.63442: stdout chunk (state=3): >>><<< 11959 1726773103.63460: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773103.6012619-11959-246390025203195=/root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195 , stderr= 11959 1726773103.63606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11959 1726773103.63660: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_stat.py 11959 1726773103.63965: Sending initial data 11959 1726773103.63980: Sent initial data (152 bytes) 11959 1726773103.66426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpwb3xh6ep /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_stat.py <<< 11959 1726773103.67405: stderr chunk (state=3): >>><<< 11959 1726773103.67413: stdout chunk (state=3): >>><<< 11959 1726773103.67439: done transferring module to remote 11959 1726773103.67454: _low_level_execute_command(): starting 11959 1726773103.67458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/ /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_stat.py && sleep 0' 11959 1726773103.70035: stderr chunk (state=2): >>><<< 11959 1726773103.70047: stdout chunk (state=2): >>><<< 11959 1726773103.70066: _low_level_execute_command() done: rc=0, stdout=, stderr= 11959 1726773103.70070: _low_level_execute_command(): starting 11959 1726773103.70076: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_stat.py && sleep 0' 11959 1726773103.85541: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773103.4588535, "mtime": 1726773095.0338898, "ctime": 1726773095.0338898, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11959 1726773103.86603: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11959 1726773103.86651: stderr chunk (state=3): >>><<< 11959 1726773103.86657: stdout chunk (state=3): >>><<< 11959 1726773103.86677: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773103.4588535, "mtime": 1726773095.0338898, "ctime": 1726773095.0338898, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 11959 1726773103.86743: done with _execute_module (stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11959 1726773103.86835: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11959 1726773103.86886: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_file.py 11959 1726773103.87229: Sending initial data 11959 1726773103.87244: Sent initial data (152 bytes) 11959 1726773103.90015: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpzcu00fhm /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_file.py <<< 11959 1726773103.91054: stderr chunk (state=3): >>><<< 11959 1726773103.91060: stdout chunk (state=3): >>><<< 11959 1726773103.91081: done transferring module to remote 11959 1726773103.91096: _low_level_execute_command(): starting 11959 1726773103.91100: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/ /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_file.py && sleep 0' 11959 1726773103.93655: stderr chunk (state=2): >>><<< 11959 1726773103.93666: stdout chunk (state=2): >>><<< 11959 1726773103.93688: _low_level_execute_command() done: rc=0, stdout=, stderr= 11959 1726773103.93693: _low_level_execute_command(): starting 11959 1726773103.93699: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/AnsiballZ_file.py && sleep 0' 11959 1726773104.09306: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp76h24a_d", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11959 1726773104.10336: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11959 1726773104.10384: stderr chunk (state=3): >>><<< 11959 1726773104.10391: stdout chunk (state=3): >>><<< 11959 1726773104.10415: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp76h24a_d", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
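Note: the "Ensure kernel_settings is in active_profile" task uses the copy action, which first stats the destination, sees that the checksum already matches, and then only calls the file module to assert ownership and mode; that is why the trace above shows a stat run followed by a file run, both with changed=false. A sketch of the task at tasks/main.yml:91; dest and mode are taken from the module_args above, while the content expression is an assumption:

- name: Ensure kernel_settings is in active_profile
  copy:
    content: "{{ __kernel_settings_active_profile }}\n"   # assumed; the file already held this content, so nothing changed
    dest: /etc/tuned/active_profile
    mode: "0600"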
11959 1726773104.13665: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmp76h24a_d', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11959 1726773104.13680: _low_level_execute_command(): starting 11959 1726773104.13686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773103.6012619-11959-246390025203195/ > /dev/null 2>&1 && sleep 0' 11959 1726773104.16363: stderr chunk (state=2): >>><<< 11959 1726773104.16375: stdout chunk (state=2): >>><<< 11959 1726773104.16396: _low_level_execute_command() done: rc=0, stdout=, stderr= 11959 1726773104.16406: handler run complete 11959 1726773104.16439: attempt loop complete, returning result 11959 1726773104.16452: _execute() done 11959 1726773104.16454: dumping result to json 11959 1726773104.16457: done dumping result, returning 11959 1726773104.16471: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [12a3200b-1e9d-1dbd-cc52-000000000e67] 11959 1726773104.16487: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e67 11959 1726773104.16524: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e67 11959 1726773104.16528: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8119 1726773104.16694: no more pending results, returning what we have 8119 1726773104.16702: results queue empty 8119 1726773104.16704: checking for any_errors_fatal 8119 1726773104.16709: done checking for any_errors_fatal 8119 1726773104.16711: checking for max_fail_percentage 8119 1726773104.16715: done checking for max_fail_percentage 8119 1726773104.16717: checking to see if all hosts have failed and the running result is not ok 8119 1726773104.16719: done checking to see if all hosts have failed 8119 1726773104.16721: getting the remaining hosts for this loop 8119 1726773104.16723: done getting the remaining hosts for this loop 8119 1726773104.16731: building list of next tasks for hosts 8119 1726773104.16734: getting the next task for host managed_node2 8119 1726773104.16741: done getting next task for host managed_node2 8119 1726773104.16745: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773104.16750: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773104.16752: done building task lists 8119 1726773104.16754: counting tasks in each state of execution 8119 1726773104.16758: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773104.16760: advancing hosts in ITERATING_TASKS 8119 1726773104.16762: starting to advance hosts 8119 1726773104.16764: getting the next task for host managed_node2 8119 1726773104.16768: done getting next task for host managed_node2 8119 1726773104.16771: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8119 1726773104.16774: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773104.16776: done advancing hosts to next task 8119 1726773104.16793: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773104.16798: getting variables 8119 1726773104.16800: in VariableManager get_vars() 8119 1726773104.16835: Calling all_inventory to load vars for managed_node2 8119 1726773104.16840: Calling groups_inventory to load vars for managed_node2 8119 1726773104.16844: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773104.16873: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.16890: Calling all_plugins_play to load vars for managed_node2 8119 1726773104.16907: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.16921: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773104.16938: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.16947: Calling groups_plugins_play to load vars for managed_node2 8119 1726773104.16961: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.16989: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.17007: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.17223: done with get_vars() 8119 1726773104.17234: done getting variables 8119 1726773104.17238: sending task start callback, copying the task so we can template it temporarily 8119 1726773104.17240: done copying, going to template now 8119 1726773104.17242: done templating 8119 1726773104.17243: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:44 -0400 (0:00:00.627) 0:01:38.729 **** 8119 1726773104.17260: sending task start callback 8119 1726773104.17261: entering _queue_task() for managed_node2/copy 8119 1726773104.17386: worker is 1 (out of 1 available) 8119 1726773104.17425: exiting _queue_task() for managed_node2/copy 8119 1726773104.17500: done queuing things up, now waiting for results queue to drain 8119 1726773104.17505: waiting for pending results... 
11975 1726773104.17572: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 11975 1726773104.17629: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e68 11975 1726773104.17674: calling self._execute() 11975 1726773104.19374: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11975 1726773104.19474: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11975 1726773104.19529: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11975 1726773104.19555: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11975 1726773104.19587: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11975 1726773104.19619: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11975 1726773104.19661: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11975 1726773104.19685: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11975 1726773104.19706: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11975 1726773104.19782: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11975 1726773104.19803: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11975 1726773104.19823: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11975 1726773104.20053: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11975 1726773104.20086: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11975 1726773104.20097: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11975 1726773104.20107: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11975 1726773104.20115: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11975 1726773104.20212: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11975 1726773104.20229: starting attempt loop 11975 1726773104.20231: running the handler 11975 1726773104.20239: _low_level_execute_command(): starting 11975 1726773104.20243: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11975 1726773104.22719: stdout chunk (state=2): >>>/root <<< 11975 1726773104.22839: stderr chunk (state=3): >>><<< 11975 1726773104.22847: stdout chunk (state=3): >>><<< 11975 1726773104.22869: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11975 1726773104.22885: _low_level_execute_command(): starting 11975 1726773104.22892: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820 `" && echo ansible-tmp-1726773104.22878-11975-34898178968820="` echo /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820 `" ) && sleep 0' 11975 1726773104.25867: stdout chunk (state=2): >>>ansible-tmp-1726773104.22878-11975-34898178968820=/root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820 <<< 11975 1726773104.25997: stderr chunk (state=3): >>><<< 11975 1726773104.26004: stdout chunk (state=3): >>><<< 11975 1726773104.26025: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773104.22878-11975-34898178968820=/root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820 , stderr= 11975 1726773104.26169: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 11975 1726773104.26231: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_stat.py 11975 1726773104.26555: Sending initial data 11975 1726773104.26569: Sent initial data (149 bytes) 11975 1726773104.29024: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmppn3kadzy /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_stat.py <<< 11975 1726773104.30011: stderr chunk (state=3): >>><<< 11975 1726773104.30017: stdout chunk (state=3): >>><<< 11975 1726773104.30039: done transferring module to remote 11975 1726773104.30053: _low_level_execute_command(): starting 11975 1726773104.30057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/ /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_stat.py && sleep 0' 11975 1726773104.32598: stderr chunk (state=2): >>><<< 11975 1726773104.32610: stdout chunk (state=2): >>><<< 11975 1726773104.32632: _low_level_execute_command() done: rc=0, stdout=, stderr= 11975 1726773104.32636: _low_level_execute_command(): starting 11975 1726773104.32642: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_stat.py && sleep 0' 11975 1726773104.48175: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773093.1698978, "mtime": 1726773095.0338898, "ctime": 1726773095.0338898, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 11975 1726773104.49235: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 11975 1726773104.49286: stderr chunk (state=3): >>><<< 11975 1726773104.49291: stdout chunk (state=3): >>><<< 11975 1726773104.49312: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773093.1698978, "mtime": 1726773095.0338898, "ctime": 1726773095.0338898, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 11975 1726773104.49370: done with _execute_module (stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11975 1726773104.49460: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 11975 1726773104.49512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_file.py 11975 1726773104.49853: Sending initial data 11975 1726773104.49867: Sent initial data (149 bytes) 11975 1726773104.52359: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpw36xr48u /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_file.py <<< 11975 1726773104.53367: stderr chunk (state=3): >>><<< 11975 1726773104.53371: stdout chunk (state=3): >>><<< 11975 1726773104.53392: done transferring module to remote 11975 1726773104.53405: _low_level_execute_command(): starting 11975 1726773104.53409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/ /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_file.py && sleep 0' 11975 1726773104.55927: stderr chunk (state=2): >>><<< 11975 1726773104.55939: stdout chunk (state=2): >>><<< 11975 1726773104.55958: _low_level_execute_command() done: rc=0, stdout=, stderr= 11975 1726773104.55961: _low_level_execute_command(): starting 11975 1726773104.55968: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/AnsiballZ_file.py && sleep 0' 11975 1726773104.72010: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp8ilicps5", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 11975 1726773104.73038: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11975 1726773104.73086: stderr chunk (state=3): >>><<< 11975 1726773104.73091: stdout chunk (state=3): >>><<< 11975 1726773104.73114: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp8ilicps5", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
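[editor's note] The file-module run above only enforces metadata on /etc/tuned/profile_mode (mode 0600, root:root, state=file, per the module_args) and reports changed=false because everything already matches. The log never prints the file body, but its 7-byte size is consistent with the word "manual" plus a newline, which is what the task name suggests it holds. A rough, illustrative Python equivalent of that metadata check (assumed behavior, not the file module's implementation):

import os
import stat

def metadata_ok(path, wanted_mode=0o600):
    # Mirror the conditions the task ends up asserting: a regular file,
    # owned by root:root, with the requested permission bits.
    st = os.stat(path)
    return (stat.S_ISREG(st.st_mode)
            and stat.S_IMODE(st.st_mode) == wanted_mode
            and st.st_uid == 0
            and st.st_gid == 0)

print(metadata_ok("/etc/tuned/profile_mode"))
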
11975 1726773104.73146: done with _execute_module (file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmp8ilicps5', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11975 1726773104.73159: _low_level_execute_command(): starting 11975 1726773104.73163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773104.22878-11975-34898178968820/ > /dev/null 2>&1 && sleep 0' 11975 1726773104.75827: stderr chunk (state=2): >>><<< 11975 1726773104.75839: stdout chunk (state=2): >>><<< 11975 1726773104.75857: _low_level_execute_command() done: rc=0, stdout=, stderr= 11975 1726773104.75866: handler run complete 11975 1726773104.75900: attempt loop complete, returning result 11975 1726773104.75915: _execute() done 11975 1726773104.75918: dumping result to json 11975 1726773104.75922: done dumping result, returning 11975 1726773104.75939: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [12a3200b-1e9d-1dbd-cc52-000000000e68] 11975 1726773104.75956: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e68 11975 1726773104.75997: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e68 11975 1726773104.76041: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8119 1726773104.76204: no more pending results, returning what we have 8119 1726773104.76210: results queue empty 8119 1726773104.76213: checking for any_errors_fatal 8119 1726773104.76218: done checking for any_errors_fatal 8119 1726773104.76220: checking for max_fail_percentage 8119 1726773104.76223: done checking for max_fail_percentage 8119 1726773104.76225: checking to see if all hosts have failed and the running result is not ok 8119 1726773104.76227: done checking to see if all hosts have failed 8119 1726773104.76230: getting the remaining hosts for this loop 8119 1726773104.76232: done getting the remaining hosts for this loop 8119 1726773104.76240: building list of next tasks for hosts 8119 1726773104.76243: getting the next task for host managed_node2 8119 1726773104.76250: done getting next task for host managed_node2 8119 1726773104.76254: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773104.76259: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773104.76262: done building task lists 8119 1726773104.76264: counting tasks in each state of execution 8119 1726773104.76268: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773104.76270: advancing hosts in ITERATING_TASKS 8119 1726773104.76272: starting to advance hosts 8119 1726773104.76274: getting the next task for host managed_node2 8119 1726773104.76279: done getting next task for host managed_node2 8119 1726773104.76282: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8119 1726773104.76287: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773104.76290: done advancing hosts to next task 8119 1726773104.76333: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 8119 1726773104.76338: getting variables 8119 1726773104.76340: in VariableManager get_vars() 8119 1726773104.76368: Calling all_inventory to load vars for managed_node2 8119 1726773104.76371: Calling groups_inventory to load vars for managed_node2 8119 1726773104.76373: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773104.76399: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.76413: Calling all_plugins_play to load vars for managed_node2 8119 1726773104.76426: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.76435: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773104.76445: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.76450: Calling groups_plugins_play to load vars for managed_node2 8119 1726773104.76460: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.76477: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.76493: Loading VarsModule 
'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773104.76693: done with get_vars() 8119 1726773104.76704: done getting variables 8119 1726773104.76708: sending task start callback, copying the task so we can template it temporarily 8119 1726773104.76711: done copying, going to template now 8119 1726773104.76712: done templating 8119 1726773104.76714: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:44 -0400 (0:00:00.594) 0:01:39.323 **** 8119 1726773104.76730: sending task start callback 8119 1726773104.76732: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773104.76851: worker is 1 (out of 1 available) 8119 1726773104.76893: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8119 1726773104.76964: done queuing things up, now waiting for results queue to drain 8119 1726773104.76970: waiting for pending results... 11988 1726773104.77036: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 11988 1726773104.77090: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e69 11988 1726773104.77138: calling self._execute() 11988 1726773104.78852: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 11988 1726773104.78931: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 11988 1726773104.78984: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 11988 1726773104.79014: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 11988 1726773104.79040: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 11988 1726773104.79068: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 11988 1726773104.79119: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 11988 1726773104.79142: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 11988 1726773104.79159: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 11988 1726773104.79249: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 11988 1726773104.79266: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 11988 1726773104.79279: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 11988 1726773104.79561: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 11988 1726773104.79595: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11988 1726773104.79606: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 11988 1726773104.79618: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11988 1726773104.79624: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11988 1726773104.79703: plugin lookup for fedora.linux_system_roles.kernel_settings_get_config failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11988 1726773104.79718: plugin lookup for fedora.linux_system_roles.kernel failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 11988 1726773104.79741: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 11988 1726773104.79758: starting attempt loop 11988 1726773104.79761: running the handler 11988 1726773104.79771: _low_level_execute_command(): starting 11988 1726773104.79775: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11988 1726773104.82241: stdout chunk (state=2): >>>/root <<< 11988 1726773104.82360: stderr chunk (state=3): >>><<< 11988 1726773104.82365: stdout chunk (state=3): >>><<< 11988 1726773104.82386: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11988 1726773104.82400: _low_level_execute_command(): starting 11988 1726773104.82405: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700 `" && echo ansible-tmp-1726773104.8239477-11988-85709297134700="` echo /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700 `" ) && sleep 0' 11988 1726773104.85249: stdout chunk (state=2): >>>ansible-tmp-1726773104.8239477-11988-85709297134700=/root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700 <<< 11988 1726773104.85375: stderr chunk (state=3): >>><<< 11988 1726773104.85380: stdout chunk (state=3): >>><<< 11988 1726773104.85400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773104.8239477-11988-85709297134700=/root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700 , stderr= 11988 1726773104.85482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/fedora.linux_system_roles.kernel_settings_get_config-ZIP_DEFLATED 11988 1726773104.85539: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700/AnsiballZ_kernel_settings_get_config.py 11988 1726773104.85863: Sending initial data 11988 1726773104.85877: Sent initial data (173 bytes) 11988 1726773104.88306: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpp4sdykc7 /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700/AnsiballZ_kernel_settings_get_config.py <<< 11988 1726773104.89639: stderr chunk (state=3): >>><<< 11988 1726773104.89646: stdout chunk (state=3): >>><<< 11988 1726773104.89672: done transferring module to remote 11988 1726773104.89690: _low_level_execute_command(): starting 11988 1726773104.89695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700/ /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11988 1726773104.92298: stderr chunk (state=2): >>><<< 11988 
1726773104.92314: stdout chunk (state=2): >>><<< 11988 1726773104.92337: _low_level_execute_command() done: rc=0, stdout=, stderr= 11988 1726773104.92341: _low_level_execute_command(): starting 11988 1726773104.92347: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11988 1726773105.07532: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 11988 1726773105.08555: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 11988 1726773105.08603: stderr chunk (state=3): >>><<< 11988 1726773105.08609: stdout chunk (state=3): >>><<< 11988 1726773105.08630: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.8.150 closed. 11988 1726773105.08657: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11988 1726773105.08670: _low_level_execute_command(): starting 11988 1726773105.08675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773104.8239477-11988-85709297134700/ > /dev/null 2>&1 && sleep 0' 11988 1726773105.11350: stderr chunk (state=2): >>><<< 11988 1726773105.11362: stdout chunk (state=2): >>><<< 11988 1726773105.11381: _low_level_execute_command() done: rc=0, stdout=, stderr= 11988 1726773105.11390: handler run complete 11988 1726773105.11418: attempt loop complete, returning result 11988 1726773105.11432: _execute() done 11988 1726773105.11434: dumping result to json 11988 1726773105.11437: done dumping result, returning 11988 1726773105.11451: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [12a3200b-1e9d-1dbd-cc52-000000000e69] 11988 1726773105.11465: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e69 11988 1726773105.11506: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e69 11988 1726773105.11568: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "vm": { "transparent_hugepages": "never" } } } 8119 1726773105.11680: no more pending results, returning what we have 8119 1726773105.11687: results queue empty 8119 1726773105.11689: checking for any_errors_fatal 8119 1726773105.11694: done checking for any_errors_fatal 8119 1726773105.11696: checking for max_fail_percentage 8119 1726773105.11699: done checking 
for max_fail_percentage 8119 1726773105.11701: checking to see if all hosts have failed and the running result is not ok 8119 1726773105.11703: done checking to see if all hosts have failed 8119 1726773105.11705: getting the remaining hosts for this loop 8119 1726773105.11707: done getting the remaining hosts for this loop 8119 1726773105.11715: building list of next tasks for hosts 8119 1726773105.11718: getting the next task for host managed_node2 8119 1726773105.11726: done getting next task for host managed_node2 8119 1726773105.11730: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773105.11734: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773105.11737: done building task lists 8119 1726773105.11739: counting tasks in each state of execution 8119 1726773105.11743: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773105.11745: advancing hosts in ITERATING_TASKS 8119 1726773105.11747: starting to advance hosts 8119 1726773105.11750: getting the next task for host managed_node2 8119 1726773105.11754: done getting next task for host managed_node2 8119 1726773105.11757: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8119 1726773105.11760: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773105.11762: done advancing hosts to next task 8119 1726773105.11777: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773105.11781: getting variables 8119 1726773105.11787: in VariableManager get_vars() 8119 1726773105.11823: Calling all_inventory to load vars for managed_node2 8119 1726773105.11828: Calling groups_inventory to load vars for managed_node2 8119 1726773105.11831: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773105.11859: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.11870: Calling all_plugins_play to load vars for managed_node2 8119 1726773105.11881: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.11894: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773105.11905: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.11912: Calling groups_plugins_play to load vars for managed_node2 8119 1726773105.11924: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.11942: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.11955: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.12163: done with get_vars() 8119 1726773105.12173: done getting variables 8119 1726773105.12178: sending task start callback, copying the task so we can template it temporarily 8119 1726773105.12180: done copying, going to template now 8119 1726773105.12182: done templating 8119 1726773105.12186: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:45 -0400 (0:00:00.354) 0:01:39.678 **** 8119 1726773105.12202: sending task start callback 8119 1726773105.12204: entering _queue_task() for managed_node2/template 8119 1726773105.12323: worker is 1 (out of 1 available) 8119 1726773105.12363: exiting _queue_task() for managed_node2/template 8119 1726773105.12436: done queuing things up, now waiting for results queue to drain 8119 1726773105.12442: waiting for pending results... 
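[editor's note] For reference, the "Get current config" result a few entries above ({"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}) has exactly the shape produced by reading the profile's INI file section by section. A small configparser sketch in Python; the sample text is reconstructed from the returned data and is an assumption, since the log never prints /etc/tuned/kernel_settings/tuned.conf itself:

import configparser

# Assumed tuned.conf body, reconstructed from the module result above.
sample = """\
[main]
summary = kernel settings

[vm]
transparent_hugepages = never
"""

parser = configparser.ConfigParser()
parser.read_string(sample)
data = {section: dict(parser.items(section)) for section in parser.sections()}
print(data)  # {'main': {'summary': 'kernel settings'}, 'vm': {'transparent_hugepages': 'never'}}
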
12000 1726773105.12512: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 12000 1726773105.12566: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e6a 12000 1726773105.12615: calling self._execute() 12000 1726773105.17937: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12000 1726773105.18038: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12000 1726773105.18102: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12000 1726773105.18138: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12000 1726773105.18175: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12000 1726773105.18211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12000 1726773105.18260: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12000 1726773105.18300: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12000 1726773105.18325: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12000 1726773105.18423: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12000 1726773105.18447: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12000 1726773105.18467: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12000 1726773105.18920: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12000 1726773105.18971: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12000 1726773105.18986: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12000 1726773105.19003: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12000 1726773105.19009: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12000 1726773105.19124: Loading ActionModule 'template' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12000 1726773105.19134: starting attempt loop 12000 1726773105.19137: running the handler 12000 1726773105.19144: _low_level_execute_command(): starting 12000 1726773105.19148: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12000 1726773105.21933: stdout chunk (state=2): >>>/root <<< 12000 1726773105.22051: stderr chunk (state=3): >>><<< 12000 1726773105.22057: stdout chunk (state=3): >>><<< 12000 1726773105.22082: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 12000 1726773105.22100: _low_level_execute_command(): starting 12000 1726773105.22107: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571 `" && echo ansible-tmp-1726773105.2209427-12000-249795935894571="` echo /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571 `" ) && sleep 0' 12000 1726773105.25126: stdout chunk (state=2): >>>ansible-tmp-1726773105.2209427-12000-249795935894571=/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571 <<< 12000 1726773105.25254: stderr chunk (state=3): >>><<< 12000 1726773105.25260: stdout chunk (state=3): >>><<< 12000 1726773105.25277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773105.2209427-12000-249795935894571=/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571 , stderr= 12000 1726773105.25306: evaluation_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 12000 1726773105.25327: search_path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 12000 1726773105.26794: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12000 1726773105.26800: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12000 1726773105.26803: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12000 1726773105.26806: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12000 1726773105.26808: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12000 1726773105.26811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.26813: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12000 1726773105.26815: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12000 1726773105.26817: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12000 1726773105.26835: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12000 1726773105.26838: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 12000 1726773105.26840: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27078: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27085: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12000 1726773105.27088: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12000 1726773105.27090: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12000 1726773105.27092: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12000 1726773105.27094: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27096: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12000 1726773105.27098: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12000 1726773105.27099: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12000 1726773105.27115: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27118: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12000 1726773105.27120: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27147: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27150: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12000 1726773105.27152: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12000 1726773105.27154: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12000 1726773105.27156: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12000 1726773105.27157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27159: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12000 1726773105.27161: Loading FilterModule 'urls' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12000 1726773105.27163: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12000 1726773105.27176: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27178: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12000 1726773105.27180: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27328: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27332: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12000 1726773105.27335: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12000 1726773105.27336: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12000 1726773105.27338: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12000 1726773105.27340: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27342: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12000 1726773105.27344: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12000 1726773105.27345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12000 1726773105.27363: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27366: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12000 1726773105.27368: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27594: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27598: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12000 1726773105.27600: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12000 1726773105.27602: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, 
class_only=False) 12000 1726773105.27605: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12000 1726773105.27608: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27610: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12000 1726773105.27612: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12000 1726773105.27614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12000 1726773105.27628: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27631: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12000 1726773105.27632: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27655: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27657: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12000 1726773105.27659: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12000 1726773105.27661: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12000 1726773105.27663: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12000 1726773105.27665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.27666: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12000 1726773105.27668: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12000 1726773105.27670: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12000 1726773105.27684: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12000 1726773105.27688: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12000 1726773105.27690: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12000 1726773105.28839: Loading ActionModule 'copy' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12000 1726773105.28914: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 12000 1726773105.28962: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_stat.py 12000 1726773105.29312: Sending initial data 12000 1726773105.29327: Sent initial data (152 bytes) 12000 1726773105.31866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpz8rw6t5w /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_stat.py <<< 12000 1726773105.32874: stderr chunk (state=3): >>><<< 12000 1726773105.32880: stdout chunk (state=3): >>><<< 12000 1726773105.32905: done transferring module to remote 12000 1726773105.32925: _low_level_execute_command(): starting 12000 1726773105.32931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/ /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_stat.py && sleep 0' 12000 1726773105.35499: stderr chunk (state=2): >>><<< 12000 1726773105.35512: stdout chunk (state=2): >>><<< 12000 1726773105.35533: _low_level_execute_command() done: rc=0, stdout=, stderr= 12000 1726773105.35536: _low_level_execute_command(): starting 12000 1726773105.35543: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_stat.py && sleep 0' 12000 1726773105.51440: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 121, "inode": 224395522, "dev": 51713, "nlink": 1, "atime": 1726773095.0208898, "mtime": 1726773094.2128932, "ctime": 1726773094.4588923, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "mimetype": "text/plain", "charset": "us-ascii", "version": "3649374478", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 12000 1726773105.52440: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. 
<<< 12000 1726773105.52491: stderr chunk (state=3): >>><<< 12000 1726773105.52497: stdout chunk (state=3): >>><<< 12000 1726773105.52519: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 121, "inode": 224395522, "dev": 51713, "nlink": 1, "atime": 1726773095.0208898, "mtime": 1726773094.2128932, "ctime": 1726773094.4588923, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "mimetype": "text/plain", "charset": "us-ascii", "version": "3649374478", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 12000 1726773105.52608: done with _execute_module (stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12000 1726773105.52952: Sending initial data 12000 1726773105.52967: Sent initial data (160 bytes) 12000 1726773105.55468: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp8c78_o9f/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source <<< 12000 1726773105.55764: stderr chunk (state=3): >>><<< 12000 1726773105.55769: stdout chunk (state=3): >>><<< 12000 1726773105.55793: _low_level_execute_command(): starting 12000 1726773105.55798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/ /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source && sleep 0' 12000 1726773105.58291: stderr chunk (state=2): >>><<< 12000 1726773105.58306: stdout chunk (state=2): >>><<< 12000 1726773105.58327: _low_level_execute_command() done: rc=0, stdout=, stderr= 12000 1726773105.58437: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 12000 1726773105.58487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_copy.py 12000 1726773105.58799: Sending initial data 12000 1726773105.58814: Sent initial data (152 bytes) 12000 1726773105.61230: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp1ntux339 
/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_copy.py <<< 12000 1726773105.62219: stderr chunk (state=3): >>><<< 12000 1726773105.62226: stdout chunk (state=3): >>><<< 12000 1726773105.62249: done transferring module to remote 12000 1726773105.62262: _low_level_execute_command(): starting 12000 1726773105.62267: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/ /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_copy.py && sleep 0' 12000 1726773105.64758: stderr chunk (state=2): >>><<< 12000 1726773105.64768: stdout chunk (state=2): >>><<< 12000 1726773105.64790: _low_level_execute_command() done: rc=0, stdout=, stderr= 12000 1726773105.64794: _low_level_execute_command(): starting 12000 1726773105.64801: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/AnsiballZ_copy.py && sleep 0' 12000 1726773105.80565: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 12000 1726773105.81657: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12000 1726773105.81710: stderr chunk (state=3): >>><<< 12000 1726773105.81715: stdout chunk (state=3): >>><<< 12000 1726773105.81736: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
12000 1726773105.81766: done with _execute_module (copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12000 1726773105.81801: _low_level_execute_command(): starting 12000 1726773105.81812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/ > /dev/null 2>&1 && sleep 0' 12000 1726773105.84428: stderr chunk (state=2): >>><<< 12000 1726773105.84438: stdout chunk (state=2): >>><<< 12000 1726773105.84457: _low_level_execute_command() done: rc=0, stdout=, stderr= 12000 1726773105.84485: handler run complete 12000 1726773105.84519: attempt loop complete, returning result 12000 1726773105.84524: _execute() done 12000 1726773105.84526: dumping result to json 12000 1726773105.84530: done dumping result, returning 12000 1726773105.84543: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [12a3200b-1e9d-1dbd-cc52-000000000e6a] 12000 1726773105.84554: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6a 12000 1726773105.84591: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6a 12000 1726773105.84665: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "7d83891795eeb6debeff7e2812501630", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "src": "/root/.ansible/tmp/ansible-tmp-1726773105.2209427-12000-249795935894571/source", "state": "file", "uid": 0 } 8119 1726773105.84798: no more pending results, returning what we have 8119 1726773105.84804: results queue empty 8119 1726773105.84806: checking for any_errors_fatal 8119 1726773105.84812: done checking for any_errors_fatal 8119 1726773105.84815: checking for max_fail_percentage 8119 1726773105.84817: done checking for max_fail_percentage 8119 1726773105.84819: checking to see if all hosts have failed and the running result is not ok 8119 1726773105.84821: done checking to see if all hosts have failed 8119 1726773105.84823: getting the remaining hosts for this loop 8119 1726773105.84825: done getting the remaining hosts for this loop 8119 1726773105.84833: building list of next tasks for hosts 8119 1726773105.84836: getting the next task for host managed_node2 8119 1726773105.84842: done getting next task for host managed_node2 8119 1726773105.84846: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773105.84851: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, 
run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773105.84854: done building task lists 8119 1726773105.84856: counting tasks in each state of execution 8119 1726773105.84859: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773105.84861: advancing hosts in ITERATING_TASKS 8119 1726773105.84863: starting to advance hosts 8119 1726773105.84866: getting the next task for host managed_node2 8119 1726773105.84870: done getting next task for host managed_node2 8119 1726773105.84873: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8119 1726773105.84876: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773105.84879: done advancing hosts to next task 8119 1726773105.84892: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773105.84896: getting variables 8119 1726773105.84899: in VariableManager get_vars() 8119 1726773105.84928: Calling all_inventory to load vars for managed_node2 8119 1726773105.84933: Calling groups_inventory to load vars for managed_node2 8119 1726773105.84937: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773105.84960: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.84970: Calling all_plugins_play to load vars for managed_node2 8119 1726773105.84980: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.84993: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773105.85005: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.85014: Calling groups_plugins_play to load vars for managed_node2 8119 1726773105.85024: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.85042: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.85055: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.85253: done with get_vars() 8119 1726773105.85263: done getting variables 8119 1726773105.85268: sending task start callback, copying the task so we can template it temporarily 8119 1726773105.85269: done copying, going to template now 8119 1726773105.85271: done templating 8119 1726773105.85272: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:45 -0400 (0:00:00.730) 0:01:40.409 **** 8119 1726773105.85290: sending task start callback 8119 1726773105.85292: entering _queue_task() for managed_node2/service 8119 1726773105.85416: worker is 1 (out of 1 available) 8119 1726773105.85454: exiting _queue_task() for managed_node2/service 8119 1726773105.85530: done queuing things up, now waiting for results queue to drain 8119 1726773105.85536: waiting for pending results... 
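The "Apply kernel settings" result above shows the role rendering kernel_settings.j2 and installing it as /etc/tuned/kernel_settings/tuned.conf with mode 0644: the action plugin first runs the stat module against the destination to compare checksums, then transfers and runs the copy module because the rendered content changed. A minimal sketch of an equivalent task, assuming a template source and omitting the role's actual variables and handlers:

    - name: Apply kernel settings
      template:
        src: kernel_settings.j2
        dest: /etc/tuned/kernel_settings/tuned.conf
        mode: '0644'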
12016 1726773105.85592: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 12016 1726773105.85646: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e6b 12016 1726773105.87369: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12016 1726773105.87455: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12016 1726773105.87526: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12016 1726773105.87554: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12016 1726773105.87584: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12016 1726773105.87617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12016 1726773105.87663: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12016 1726773105.87688: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12016 1726773105.87706: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12016 1726773105.87789: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12016 1726773105.87808: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12016 1726773105.87823: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12016 1726773105.87982: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12016 1726773105.87988: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12016 1726773105.87990: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12016 1726773105.87992: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12016 1726773105.87994: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12016 1726773105.87996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12016 1726773105.87998: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12016 1726773105.87999: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12016 1726773105.88001: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12016 1726773105.88017: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 
12016 1726773105.88019: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12016 1726773105.88021: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12016 1726773105.88175: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12016 1726773105.88179: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12016 1726773105.88181: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12016 1726773105.88187: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12016 1726773105.88190: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12016 1726773105.88193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12016 1726773105.88195: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12016 1726773105.88197: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12016 1726773105.88199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12016 1726773105.88219: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12016 1726773105.88222: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12016 1726773105.88223: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12016 1726773105.88428: when evaluation is False, skipping this task 12016 1726773105.88464: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12016 1726773105.88468: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12016 1726773105.88470: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12016 1726773105.88472: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12016 1726773105.88474: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12016 1726773105.88476: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12016 1726773105.88477: Loading 
FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12016 1726773105.88479: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12016 1726773105.88481: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12016 1726773105.88506: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12016 1726773105.88510: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12016 1726773105.88513: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12016 1726773105.88585: dumping result to json 12016 1726773105.88614: done dumping result, returning 12016 1726773105.88622: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [12a3200b-1e9d-1dbd-cc52-000000000e6b] 12016 1726773105.88630: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6b 12016 1726773105.88635: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6b 12016 1726773105.88637: WORKER PROCESS EXITING skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "item": "tuned", "skip_reason": "Conditional result was False" } 8119 1726773105.88794: no more pending results, returning what we have 8119 1726773105.88801: results queue empty 8119 1726773105.88803: checking for any_errors_fatal 8119 1726773105.88813: done checking for any_errors_fatal 8119 1726773105.88815: checking for max_fail_percentage 8119 1726773105.88818: done checking for max_fail_percentage 8119 1726773105.88820: checking to see if all hosts have failed and the running result is not ok 8119 1726773105.88822: done checking to see if all hosts have failed 8119 1726773105.88824: getting the remaining hosts for this loop 8119 1726773105.88827: done getting the remaining hosts for this loop 8119 1726773105.88834: building list of next tasks for hosts 8119 1726773105.88836: getting the next task for host managed_node2 8119 1726773105.88845: done getting next task for host managed_node2 8119 1726773105.88849: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773105.88854: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773105.88857: done building task lists 8119 1726773105.88858: counting tasks in each state of execution 8119 1726773105.88862: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773105.88864: advancing hosts in ITERATING_TASKS 8119 1726773105.88866: starting to advance hosts 8119 1726773105.88869: getting the next task for host managed_node2 8119 1726773105.88873: done getting next task for host managed_node2 8119 1726773105.88876: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8119 1726773105.88880: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773105.88884: done advancing hosts to next task 8119 1726773105.88899: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773105.88903: getting variables 8119 1726773105.88907: in VariableManager get_vars() 8119 1726773105.88935: Calling all_inventory to load vars for managed_node2 8119 1726773105.88938: Calling groups_inventory to load vars for managed_node2 8119 1726773105.88940: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773105.88960: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.88970: Calling all_plugins_play to load vars for managed_node2 8119 1726773105.88980: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.88991: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773105.89002: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.89008: Calling groups_plugins_play to load vars for managed_node2 8119 1726773105.89018: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.89036: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.89049: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773105.89267: done with get_vars() 8119 1726773105.89281: done getting variables 8119 1726773105.89292: sending 
task start callback, copying the task so we can template it temporarily 8119 1726773105.89294: done copying, going to template now 8119 1726773105.89297: done templating 8119 1726773105.89300: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:45 -0400 (0:00:00.040) 0:01:40.449 **** 8119 1726773105.89324: sending task start callback 8119 1726773105.89326: entering _queue_task() for managed_node2/command 8119 1726773105.89474: worker is 1 (out of 1 available) 8119 1726773105.89510: exiting _queue_task() for managed_node2/command 8119 1726773105.89584: done queuing things up, now waiting for results queue to drain 8119 1726773105.89589: waiting for pending results... 12020 1726773105.89808: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 12020 1726773105.89881: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e6c 12020 1726773105.89937: calling self._execute() 12020 1726773105.91785: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12020 1726773105.91875: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12020 1726773105.91934: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12020 1726773105.91965: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12020 1726773105.91994: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12020 1726773105.92028: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12020 1726773105.92074: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12020 1726773105.92100: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12020 1726773105.92118: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12020 1726773105.92197: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12020 1726773105.92223: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12020 1726773105.92241: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12020 1726773105.92908: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12020 1726773105.92943: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12020 1726773105.92953: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12020 1726773105.92962: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12020 1726773105.92967: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12020 1726773105.93059: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12020 1726773105.93067: starting attempt loop 12020 1726773105.93070: running the handler 12020 1726773105.93078: _low_level_execute_command(): starting 12020 1726773105.93081: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12020 1726773105.95577: stdout chunk (state=2): >>>/root <<< 12020 1726773105.95699: stderr chunk (state=3): >>><<< 12020 1726773105.95705: stdout chunk (state=3): >>><<< 12020 1726773105.95727: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 12020 1726773105.95740: _low_level_execute_command(): starting 12020 1726773105.95747: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803 `" && echo ansible-tmp-1726773105.9573486-12020-229760654013803="` echo /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803 `" ) && sleep 0' 12020 1726773105.98557: stdout chunk (state=2): >>>ansible-tmp-1726773105.9573486-12020-229760654013803=/root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803 <<< 12020 1726773105.98685: stderr chunk (state=3): >>><<< 12020 1726773105.98691: stdout chunk (state=3): >>><<< 12020 1726773105.98715: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773105.9573486-12020-229760654013803=/root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803 , stderr= 12020 1726773105.98827: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 12020 1726773105.98878: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803/AnsiballZ_command.py 12020 1726773105.99213: Sending initial data 12020 1726773105.99227: Sent initial data (155 bytes) 12020 1726773106.01669: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp6wy1wo00 /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803/AnsiballZ_command.py <<< 12020 1726773106.02648: stderr chunk (state=3): >>><<< 12020 1726773106.02653: stdout chunk (state=3): >>><<< 12020 1726773106.02675: done transferring module to remote 12020 1726773106.02690: _low_level_execute_command(): starting 12020 1726773106.02696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803/ /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803/AnsiballZ_command.py && sleep 0' 12020 1726773106.05220: stderr chunk (state=2): >>><<< 12020 1726773106.05231: stdout chunk (state=2): >>><<< 12020 1726773106.05248: _low_level_execute_command() done: rc=0, stdout=, stderr= 12020 1726773106.05251: _low_level_execute_command(): starting 12020 1726773106.05257: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803/AnsiballZ_command.py && sleep 0' 12020 1726773107.34942: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:46.195643", "end": "2024-09-19 15:11:47.343767", "delta": "0:00:01.148124", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest 
kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12020 1726773107.35739: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12020 1726773107.35791: stderr chunk (state=3): >>><<< 12020 1726773107.35797: stdout chunk (state=3): >>><<< 12020 1726773107.35820: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "stdout": "", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:46.195643", "end": "2024-09-19 15:11:47.343767", "delta": "0:00:01.148124", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 12020 1726773107.35852: done with _execute_module (command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12020 1726773107.35864: _low_level_execute_command(): starting 12020 1726773107.35871: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773105.9573486-12020-229760654013803/ > /dev/null 2>&1 && sleep 0' 12020 1726773107.38517: stderr chunk (state=2): >>><<< 12020 1726773107.38529: stdout chunk (state=2): >>><<< 12020 1726773107.38550: _low_level_execute_command() done: rc=0, stdout=, stderr= 12020 1726773107.38558: handler run complete 12020 1726773107.38567: attempt loop complete, returning result 12020 1726773107.38580: _execute() done 12020 1726773107.38581: dumping result to json 12020 1726773107.38588: done dumping result, returning 12020 1726773107.38601: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [12a3200b-1e9d-1dbd-cc52-000000000e6c] 12020 1726773107.38616: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6c 12020 1726773107.38653: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6c 12020 1726773107.38657: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.148124", "end": "2024-09-19 15:11:47.343767", "rc": 0, "start": "2024-09-19 15:11:46.195643" } 8119 1726773107.38885: no more pending results, returning what we have 8119 1726773107.38891: results queue empty 8119 1726773107.38893: checking for any_errors_fatal 8119 1726773107.38898: done checking for any_errors_fatal 8119 1726773107.38900: checking for max_fail_percentage 8119 1726773107.38903: done checking for max_fail_percentage 8119 1726773107.38904: checking to see if all hosts have failed and the running 
result is not ok 8119 1726773107.38906: done checking to see if all hosts have failed 8119 1726773107.38911: getting the remaining hosts for this loop 8119 1726773107.38913: done getting the remaining hosts for this loop 8119 1726773107.38921: building list of next tasks for hosts 8119 1726773107.38923: getting the next task for host managed_node2 8119 1726773107.38930: done getting next task for host managed_node2 8119 1726773107.38933: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773107.38936: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773107.38938: done building task lists 8119 1726773107.38939: counting tasks in each state of execution 8119 1726773107.38942: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773107.38944: advancing hosts in ITERATING_TASKS 8119 1726773107.38945: starting to advance hosts 8119 1726773107.38946: getting the next task for host managed_node2 8119 1726773107.38949: done getting next task for host managed_node2 8119 1726773107.38951: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8119 1726773107.38954: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773107.38955: done advancing hosts to next task 8119 1726773107.38967: getting variables 8119 1726773107.38970: in VariableManager get_vars() 8119 1726773107.38998: Calling all_inventory to load vars for managed_node2 8119 1726773107.39002: Calling groups_inventory to load vars for managed_node2 8119 1726773107.39005: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773107.39028: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.39039: Calling all_plugins_play to load vars for managed_node2 8119 1726773107.39049: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.39057: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773107.39067: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.39073: Calling groups_plugins_play to load vars for managed_node2 8119 1726773107.39082: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.39106: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.39125: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.39323: done with get_vars() 8119 1726773107.39333: done getting variables 8119 1726773107.39338: sending task start callback, copying the task so we can template it temporarily 8119 1726773107.39340: done copying, going to template now 8119 1726773107.39341: done templating 8119 1726773107.39343: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:47 -0400 (0:00:01.500) 0:01:41.950 **** 8119 1726773107.39359: sending task start callback 8119 1726773107.39361: entering _queue_task() for managed_node2/include_tasks 8119 1726773107.39480: worker is 1 (out of 1 available) 8119 1726773107.39522: exiting _queue_task() for managed_node2/include_tasks 8119 1726773107.39592: done queuing things up, now waiting for results queue to drain 8119 1726773107.39597: waiting for pending results... 
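The "Tuned apply settings" task above activates the combined profile by running tuned-adm with both profile names, which took about 1.15 seconds on managed_node2. A sketch of an equivalent task reconstructed from the module arguments shown in the log (the actual task at main.yml:157 is not visible here and may differ):

    - name: Tuned apply settings
      command: tuned-adm profile 'virtual-guest kernel_settings'

On the managed host, tuned-adm active can be used to confirm which profile is currently in effect.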
12033 1726773107.39655: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 12033 1726773107.39713: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e6d 12033 1726773107.39759: calling self._execute() 12033 1726773107.41522: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12033 1726773107.41611: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12033 1726773107.41666: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12033 1726773107.41696: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12033 1726773107.41727: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12033 1726773107.41755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726773107.41801: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12033 1726773107.41838: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12033 1726773107.41856: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726773107.41936: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12033 1726773107.41954: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12033 1726773107.41968: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12033 1726773107.42225: _execute() done 12033 1726773107.42229: dumping result to json 12033 1726773107.42231: done dumping result, returning 12033 1726773107.42235: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [12a3200b-1e9d-1dbd-cc52-000000000e6d] 12033 1726773107.42245: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6d 12033 1726773107.42271: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6d 12033 1726773107.42276: WORKER PROCESS EXITING 8119 1726773107.42454: no more pending results, returning what we have 8119 1726773107.42464: in VariableManager get_vars() 8119 1726773107.42507: Calling all_inventory to load vars for managed_node2 8119 1726773107.42516: Calling groups_inventory to load vars for managed_node2 8119 1726773107.42520: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773107.42545: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.42556: Calling all_plugins_play to load vars for managed_node2 8119 1726773107.42567: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.42576: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773107.42589: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.42602: Calling groups_plugins_play to load vars for managed_node2 
8119 1726773107.42615: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.42634: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.42647: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.42855: done with get_vars() 8119 1726773107.42895: we have included files to process 8119 1726773107.42898: generating all_blocks data 8119 1726773107.42917: done generating all_blocks data 8119 1726773107.42920: processing included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773107.42921: loading included file: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8119 1726773107.42924: Loading data from /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8119 1726773107.43134: done processing included file 8119 1726773107.43136: iterating over new_blocks loaded from include file 8119 1726773107.43138: in VariableManager get_vars() 8119 1726773107.43158: done with get_vars() 8119 1726773107.43160: filtering new block on tags 8119 1726773107.43229: done filtering new block on tags 8119 1726773107.43239: done iterating over new_blocks loaded from include file 8119 1726773107.43241: extending task lists for all hosts with included blocks 8119 1726773107.43704: done extending task lists 8119 1726773107.43709: done processing included files 8119 1726773107.43711: results queue empty 8119 1726773107.43713: checking for any_errors_fatal 8119 1726773107.43716: done checking for any_errors_fatal 8119 1726773107.43718: checking for max_fail_percentage 8119 1726773107.43719: done checking for max_fail_percentage 8119 1726773107.43720: checking to see if all hosts have failed and the running result is not ok 8119 1726773107.43722: done checking to see if all hosts have failed 8119 1726773107.43723: getting the remaining hosts for this loop 8119 1726773107.43725: done getting the remaining hosts for this loop 8119 1726773107.43728: building list of next tasks for hosts 8119 1726773107.43730: getting the next task for host managed_node2 8119 1726773107.43735: done getting next task for host managed_node2 8119 1726773107.43737: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773107.43741: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773107.43743: done building task lists 8119 1726773107.43744: counting tasks in each state of execution 8119 1726773107.43746: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773107.43748: advancing hosts in ITERATING_TASKS 8119 1726773107.43749: starting to advance hosts 8119 1726773107.43751: getting the next task for host managed_node2 8119 1726773107.43754: done getting next task for host managed_node2 8119 1726773107.43756: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8119 1726773107.43758: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773107.43759: done advancing hosts to next task 8119 1726773107.43765: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773107.43767: getting variables 8119 1726773107.43768: in VariableManager get_vars() 8119 1726773107.43781: Calling all_inventory to load vars for managed_node2 8119 1726773107.43787: Calling groups_inventory to load vars for managed_node2 8119 1726773107.43789: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773107.43803: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.43812: Calling all_plugins_play to load vars for managed_node2 8119 1726773107.43823: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.43831: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773107.43841: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.43847: Calling groups_plugins_play to load vars for managed_node2 8119 1726773107.43856: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.43873: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.43889: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.44066: done with get_vars() 8119 1726773107.44075: done getting variables 8119 1726773107.44080: sending task start callback, copying the task so we can template it temporarily 8119 1726773107.44081: done copying, going to template now 8119 1726773107.44085: done templating 8119 1726773107.44086: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:47 -0400 (0:00:00.047) 0:01:41.997 **** 8119 1726773107.44102: sending task start callback 8119 1726773107.44103: entering _queue_task() for managed_node2/command 8119 1726773107.44241: worker is 1 (out of 1 available) 8119 1726773107.44278: exiting _queue_task() for managed_node2/command 8119 1726773107.44351: done queuing things up, now waiting for results queue to drain 8119 1726773107.44356: waiting for pending results... 
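The task queued here ("Check that settings are applied correctly", verify_settings.yml:2) is dispatched to the command action plugin, and the module invocation further down in this trace shows it simply runs tuned-adm verify -i on the managed node. Note that the raw module JSON reports "changed": true while the task result callback prints "changed": false, which is what a changed_when: false on the task would produce. A minimal sketch of such a verification task, with hypothetical option and register names (the role's actual task may differ):

    - name: Check that settings are applied correctly
      command: tuned-adm verify -i
      register: __kernel_settings_register_verify   # hypothetical register name
      changed_when: false                           # consistent with the changed: false task result in this trace
      failed_when: false                            # assumed, so follow-up tasks can inspect rc themselves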
12035 1726773107.44418: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 12035 1726773107.44477: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000f5c 12035 1726773107.44525: calling self._execute() 12035 1726773107.44703: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12035 1726773107.44748: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12035 1726773107.44760: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12035 1726773107.44770: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12035 1726773107.44776: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12035 1726773107.44897: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12035 1726773107.44920: starting attempt loop 12035 1726773107.44923: running the handler 12035 1726773107.44933: _low_level_execute_command(): starting 12035 1726773107.44937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12035 1726773107.47448: stdout chunk (state=2): >>>/root <<< 12035 1726773107.47567: stderr chunk (state=3): >>><<< 12035 1726773107.47573: stdout chunk (state=3): >>><<< 12035 1726773107.47596: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 12035 1726773107.47614: _low_level_execute_command(): starting 12035 1726773107.47620: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704 `" && echo ansible-tmp-1726773107.4760487-12035-216678330224704="` echo /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704 `" ) && sleep 0' 12035 1726773107.50322: stdout chunk (state=2): >>>ansible-tmp-1726773107.4760487-12035-216678330224704=/root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704 <<< 12035 1726773107.50474: stderr chunk (state=3): >>><<< 12035 1726773107.50480: stdout chunk (state=3): >>><<< 12035 1726773107.50504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773107.4760487-12035-216678330224704=/root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704 , stderr= 12035 1726773107.50633: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 12035 1726773107.50692: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704/AnsiballZ_command.py 12035 1726773107.51022: Sending initial data 12035 1726773107.51037: Sent initial data (155 bytes) 12035 1726773107.53477: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmph_7nt6is /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704/AnsiballZ_command.py <<< 12035 1726773107.54449: stderr chunk (state=3): >>><<< 12035 1726773107.54454: stdout chunk (state=3): >>><<< 12035 1726773107.54475: done transferring module to remote 12035 1726773107.54489: 
_low_level_execute_command(): starting 12035 1726773107.54495: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704/ /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704/AnsiballZ_command.py && sleep 0' 12035 1726773107.57010: stderr chunk (state=2): >>><<< 12035 1726773107.57020: stdout chunk (state=2): >>><<< 12035 1726773107.57038: _low_level_execute_command() done: rc=0, stdout=, stderr= 12035 1726773107.57041: _low_level_execute_command(): starting 12035 1726773107.57048: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704/AnsiballZ_command.py && sleep 0' 12035 1726773107.82541: stdout chunk (state=2): >>> {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:47.714702", "end": "2024-09-19 15:11:47.820662", "delta": "0:00:00.105960", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12035 1726773107.83527: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12035 1726773107.83572: stderr chunk (state=3): >>><<< 12035 1726773107.83578: stdout chunk (state=3): >>><<< 12035 1726773107.83604: _low_level_execute_command() done: rc=0, stdout= {"cmd": ["tuned-adm", "verify", "-i"], "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "start": "2024-09-19 15:11:47.714702", "end": "2024-09-19 15:11:47.820662", "delta": "0:00:00.105960", "changed": true, "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "warn": true, "_uses_shell": false, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
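The worker output above is the generic remote-module lifecycle: probe the home directory with echo ~, create a per-task directory under ~/.ansible/tmp, push AnsiballZ_command.py over SFTP, chmod it, execute it with /usr/libexec/platform-python, and stream the JSON result back over the shared SSH connection. The interpreter, shell, and temp path seen in these lines are controlled by ordinary connection/shell variables; a sketch of how they could be pinned in host_vars, using the same values this trace happens to show:

    # host_vars/managed_node2.yml (illustrative; values match what appears in the trace)
    ansible_python_interpreter: /usr/libexec/platform-python
    ansible_remote_tmp: ~/.ansible/tmp
    ansible_shell_executable: /bin/sh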
12035 1726773107.83642: done with _execute_module (command, {'_raw_params': 'tuned-adm verify -i', 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12035 1726773107.83653: _low_level_execute_command(): starting 12035 1726773107.83658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773107.4760487-12035-216678330224704/ > /dev/null 2>&1 && sleep 0' 12035 1726773107.86360: stderr chunk (state=2): >>><<< 12035 1726773107.86374: stdout chunk (state=2): >>><<< 12035 1726773107.86396: _low_level_execute_command() done: rc=0, stdout=, stderr= 12035 1726773107.86403: handler run complete 12035 1726773107.86443: attempt loop complete, returning result 12035 1726773107.86456: _execute() done 12035 1726773107.86458: dumping result to json 12035 1726773107.86461: done dumping result, returning 12035 1726773107.86473: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [12a3200b-1e9d-1dbd-cc52-000000000f5c] 12035 1726773107.86491: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000f5c 12035 1726773107.86532: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000f5c 12035 1726773107.86536: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.105960", "end": "2024-09-19 15:11:47.820662", "rc": 0, "start": "2024-09-19 15:11:47.714702" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8119 1726773107.86811: no more pending results, returning what we have 8119 1726773107.86815: results queue empty 8119 1726773107.86816: checking for any_errors_fatal 8119 1726773107.86818: done checking for any_errors_fatal 8119 1726773107.86820: checking for max_fail_percentage 8119 1726773107.86822: done checking for max_fail_percentage 8119 1726773107.86823: checking to see if all hosts have failed and the running result is not ok 8119 1726773107.86825: done checking to see if all hosts have failed 8119 1726773107.86826: getting the remaining hosts for this loop 8119 1726773107.86828: done getting the remaining hosts for this loop 8119 1726773107.86833: building list of next tasks for hosts 8119 1726773107.86835: getting the next task for host managed_node2 8119 1726773107.86841: done getting next task for host managed_node2 8119 1726773107.86844: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773107.86848: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773107.86851: done building task lists 8119 1726773107.86852: counting tasks in each state of execution 8119 1726773107.86855: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773107.86856: advancing hosts in ITERATING_TASKS 8119 1726773107.86858: starting to advance hosts 8119 1726773107.86859: getting the next task for host managed_node2 8119 1726773107.86863: done getting next task for host managed_node2 8119 1726773107.86865: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8119 1726773107.86867: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773107.86869: done advancing hosts to next task 8119 1726773107.86880: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773107.86885: getting variables 8119 1726773107.86888: in VariableManager get_vars() 8119 1726773107.86915: Calling all_inventory to load vars for managed_node2 8119 1726773107.86918: Calling groups_inventory to load vars for managed_node2 8119 1726773107.86921: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773107.86943: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.86953: Calling all_plugins_play to load vars for managed_node2 8119 1726773107.86963: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.86971: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773107.86986: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.86994: Calling groups_plugins_play to load vars for managed_node2 8119 1726773107.87004: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.87023: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.87037: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.87239: done with get_vars() 8119 1726773107.87250: done getting variables 8119 1726773107.87255: sending task start callback, copying the task so we can template it temporarily 8119 1726773107.87257: done copying, going to template now 8119 1726773107.87259: done templating 8119 1726773107.87260: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:47 -0400 (0:00:00.431) 0:01:42.429 **** 8119 1726773107.87276: sending task start callback 8119 1726773107.87278: entering _queue_task() for managed_node2/shell 8119 1726773107.87401: worker is 1 (out of 1 available) 8119 1726773107.87439: exiting _queue_task() for managed_node2/shell 8119 1726773107.87510: done queuing things up, now waiting for results queue to drain 8119 1726773107.87516: waiting for pending results... 
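The next task ("Get last verify results from log", verify_settings.yml:12) goes to the shell action and, as the worker output below shows, is skipped because its when condition evaluates to False; that is consistent with the preceding tuned-adm verify having returned rc=0. A hedged sketch of the pattern, reusing the hypothetical register name from the earlier sketch (the exact command and guard in the role may differ):

    - name: Get last verify results from log
      shell: grep -i error /var/log/tuned/tuned.log | tail -n 20   # illustrative command
      register: __kernel_settings_register_verify_log              # hypothetical register name
      changed_when: false
      failed_when: false
      when: __kernel_settings_register_verify.rc != 0              # assumed guard; rc was 0 here, so this is skipped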
12044 1726773107.87578: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 12044 1726773107.87644: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000f5d 12044 1726773107.87688: calling self._execute() 12044 1726773107.89492: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12044 1726773107.89581: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12044 1726773107.89645: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12044 1726773107.89673: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12044 1726773107.89702: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12044 1726773107.89744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12044 1726773107.89794: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12044 1726773107.89818: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12044 1726773107.89835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12044 1726773107.89914: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12044 1726773107.89932: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12044 1726773107.89947: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12044 1726773107.90199: when evaluation is False, skipping this task 12044 1726773107.90204: _execute() done 12044 1726773107.90206: dumping result to json 12044 1726773107.90208: done dumping result, returning 12044 1726773107.90213: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [12a3200b-1e9d-1dbd-cc52-000000000f5d] 12044 1726773107.90221: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000f5d 12044 1726773107.90247: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000f5d 12044 1726773107.90251: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773107.90403: no more pending results, returning what we have 8119 1726773107.90411: results queue empty 8119 1726773107.90413: checking for any_errors_fatal 8119 1726773107.90418: done checking for any_errors_fatal 8119 1726773107.90420: checking for max_fail_percentage 8119 1726773107.90424: done checking for max_fail_percentage 8119 1726773107.90426: checking to see if all hosts have failed and the running result is not ok 8119 1726773107.90428: done checking to see if all hosts have failed 8119 1726773107.90430: getting the remaining hosts for this loop 8119 1726773107.90432: done getting the remaining hosts for this loop 8119 1726773107.90440: building list of next tasks for hosts 8119 1726773107.90442: getting the next task for host managed_node2 8119 1726773107.90452: done getting next task for host managed_node2 8119 1726773107.90457: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader 
errors 8119 1726773107.90463: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773107.90466: done building task lists 8119 1726773107.90468: counting tasks in each state of execution 8119 1726773107.90471: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773107.90474: advancing hosts in ITERATING_TASKS 8119 1726773107.90476: starting to advance hosts 8119 1726773107.90478: getting the next task for host managed_node2 8119 1726773107.90485: done getting next task for host managed_node2 8119 1726773107.90488: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8119 1726773107.90493: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773107.90495: done advancing hosts to next task 8119 1726773107.90512: Loading ActionModule 'fail' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773107.90517: getting variables 8119 1726773107.90520: in VariableManager get_vars() 8119 1726773107.90553: Calling all_inventory to load vars for managed_node2 8119 1726773107.90557: Calling groups_inventory to load vars for managed_node2 8119 1726773107.90559: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773107.90580: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.90593: Calling all_plugins_play to load vars for managed_node2 8119 1726773107.90604: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.90616: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773107.90627: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.90633: Calling groups_plugins_play to load vars for managed_node2 8119 1726773107.90642: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.90659: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.90672: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.90902: done with get_vars() 8119 1726773107.90915: done getting variables 8119 1726773107.90920: sending task start callback, copying the task so we can template it temporarily 8119 1726773107.90922: done copying, going to template now 8119 1726773107.90924: done templating 8119 1726773107.90925: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:47 -0400 (0:00:00.036) 0:01:42.465 **** 8119 1726773107.90942: sending task start callback 8119 1726773107.90943: entering _queue_task() for managed_node2/fail 8119 1726773107.91088: worker is 1 (out of 1 available) 8119 1726773107.91124: exiting _queue_task() for managed_node2/fail 8119 1726773107.91200: done queuing things up, now waiting for results queue to drain 8119 1726773107.91205: waiting for pending results... 
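The "Report errors that are not bootloader errors" task (verify_settings.yml:23) is handled by the fail action plugin and is likewise skipped in this run. A fail task of this shape only fires when its guard holds, so on a clean verify it is a no-op; a minimal sketch under the same assumed variable names as above:

    - name: Report errors that are not bootloader errors
      fail:
        msg: tuned-adm verify reported errors - see /var/log/tuned/tuned.log
      when: __kernel_settings_register_verify.rc != 0   # assumed guard; the real role presumably also filters out bootloader-only errors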
12048 1726773107.91432: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 12048 1726773107.91524: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000f5e 12048 1726773107.91579: calling self._execute() 12048 1726773107.93460: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12048 1726773107.93548: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12048 1726773107.93616: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12048 1726773107.93644: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12048 1726773107.93674: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12048 1726773107.93706: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12048 1726773107.93754: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12048 1726773107.93779: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12048 1726773107.93806: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12048 1726773107.93881: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12048 1726773107.93904: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12048 1726773107.93922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12048 1726773107.94189: when evaluation is False, skipping this task 12048 1726773107.94193: _execute() done 12048 1726773107.94195: dumping result to json 12048 1726773107.94197: done dumping result, returning 12048 1726773107.94201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [12a3200b-1e9d-1dbd-cc52-000000000f5e] 12048 1726773107.94210: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000f5e 12048 1726773107.94237: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000f5e 12048 1726773107.94242: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } 8119 1726773107.94468: no more pending results, returning what we have 8119 1726773107.94473: results queue empty 8119 1726773107.94475: checking for any_errors_fatal 8119 1726773107.94480: done checking for any_errors_fatal 8119 1726773107.94482: checking for max_fail_percentage 8119 1726773107.94487: done checking for max_fail_percentage 8119 1726773107.94489: checking to see if all hosts have failed and the running result is not ok 8119 1726773107.94491: done checking to see if all hosts have failed 8119 1726773107.94493: getting the remaining hosts for this loop 8119 1726773107.94496: done getting the remaining hosts for this loop 8119 1726773107.94504: building list of next tasks for hosts 8119 1726773107.94507: getting the next task for host managed_node2 8119 1726773107.94519: done getting next task for host managed_node2 8119 1726773107.94524: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag 
that reboot is needed to apply changes 8119 1726773107.94529: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773107.94532: done building task lists 8119 1726773107.94533: counting tasks in each state of execution 8119 1726773107.94536: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773107.94537: advancing hosts in ITERATING_TASKS 8119 1726773107.94539: starting to advance hosts 8119 1726773107.94540: getting the next task for host managed_node2 8119 1726773107.94544: done getting next task for host managed_node2 8119 1726773107.94546: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8119 1726773107.94548: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773107.94550: done advancing hosts to next task 8119 1726773107.94561: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773107.94564: getting variables 8119 1726773107.94567: in VariableManager get_vars() 8119 1726773107.94596: Calling all_inventory to load vars for managed_node2 8119 1726773107.94600: Calling groups_inventory to load vars for managed_node2 8119 1726773107.94602: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773107.94626: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.94637: Calling all_plugins_play to load vars for managed_node2 8119 1726773107.94648: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.94656: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773107.94666: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.94672: Calling groups_plugins_play to load vars for managed_node2 8119 1726773107.94681: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.94702: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.94719: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.94928: done with get_vars() 8119 1726773107.94939: done getting variables 8119 1726773107.94944: sending task start callback, copying the task so we can template it temporarily 8119 1726773107.94946: done copying, going to template now 8119 1726773107.94947: done templating 8119 1726773107.94949: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:47 -0400 (0:00:00.040) 0:01:42.506 **** 8119 1726773107.94965: sending task start callback 8119 1726773107.94967: entering _queue_task() for managed_node2/set_fact 8119 1726773107.95096: worker is 1 (out of 1 available) 8119 1726773107.95137: exiting _queue_task() for managed_node2/set_fact 8119 1726773107.95213: done queuing things up, now waiting for results queue to drain 8119 1726773107.95219: waiting for pending results... 
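set_fact runs entirely on the controller: the worker output below loads the ssh connection and sh shell plugins but never issues a _low_level_execute_command(), and the result simply records kernel_settings_reboot_required: false. A sketch of what the task at main.yml:177 might look like; the fact name is taken from the result below, while the source expression is an assumption:

    - name: Set the flag that reboot is needed to apply changes
      set_fact:
        kernel_settings_reboot_required: "{{ __kernel_settings_needs_reboot | default(false) }}"  # hypothetical source variable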
12051 1726773107.95274: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 12051 1726773107.95327: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e6e 12051 1726773107.95372: calling self._execute() 12051 1726773107.95533: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12051 1726773107.95572: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12051 1726773107.95587: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12051 1726773107.95599: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12051 1726773107.95607: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12051 1726773107.95728: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12051 1726773107.95750: starting attempt loop 12051 1726773107.95752: running the handler 12051 1726773107.95770: handler run complete 12051 1726773107.95773: attempt loop complete, returning result 12051 1726773107.95775: _execute() done 12051 1726773107.95776: dumping result to json 12051 1726773107.95778: done dumping result, returning 12051 1726773107.95782: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [12a3200b-1e9d-1dbd-cc52-000000000e6e] 12051 1726773107.95792: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6e 12051 1726773107.95821: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6e 12051 1726773107.95824: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8119 1726773107.96016: no more pending results, returning what we have 8119 1726773107.96020: results queue empty 8119 1726773107.96021: checking for any_errors_fatal 8119 1726773107.96025: done checking for any_errors_fatal 8119 1726773107.96026: checking for max_fail_percentage 8119 1726773107.96028: done checking for max_fail_percentage 8119 1726773107.96030: checking to see if all hosts have failed and the running result is not ok 8119 1726773107.96031: done checking to see if all hosts have failed 8119 1726773107.96032: getting the remaining hosts for this loop 8119 1726773107.96034: done getting the remaining hosts for this loop 8119 1726773107.96038: building list of next tasks for hosts 8119 1726773107.96040: getting the next task for host managed_node2 8119 1726773107.96046: done getting next task for host managed_node2 8119 1726773107.96048: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773107.96052: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773107.96054: done building task lists 8119 1726773107.96055: counting tasks in each state of execution 8119 1726773107.96058: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773107.96059: advancing hosts in ITERATING_TASKS 8119 1726773107.96061: starting to advance hosts 8119 1726773107.96063: getting the next task for host managed_node2 8119 1726773107.96066: done getting next task for host managed_node2 8119 1726773107.96068: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8119 1726773107.96070: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773107.96072: done advancing hosts to next task 8119 1726773107.96082: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773107.96088: getting variables 8119 1726773107.96090: in VariableManager get_vars() 8119 1726773107.96116: Calling all_inventory to load vars for managed_node2 8119 1726773107.96119: Calling groups_inventory to load vars for managed_node2 8119 1726773107.96122: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773107.96141: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.96150: Calling all_plugins_play to load vars for managed_node2 8119 1726773107.96160: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.96168: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773107.96178: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.96186: Calling groups_plugins_play to load vars for managed_node2 8119 1726773107.96196: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.96215: Loading VarsModule 'host_group_vars' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.96229: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773107.96449: done with get_vars() 8119 1726773107.96459: done getting variables 8119 1726773107.96464: sending task start callback, copying the task so we can template it temporarily 8119 1726773107.96465: done copying, going to template now 8119 1726773107.96467: done templating 8119 1726773107.96468: here goes the callback... TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:47 -0400 (0:00:00.015) 0:01:42.521 **** 8119 1726773107.96486: sending task start callback 8119 1726773107.96489: entering _queue_task() for managed_node2/set_fact 8119 1726773107.96601: worker is 1 (out of 1 available) 8119 1726773107.96637: exiting _queue_task() for managed_node2/set_fact 8119 1726773107.96705: done queuing things up, now waiting for results queue to drain 8119 1726773107.96710: waiting for pending results... 12053 1726773107.96772: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 12053 1726773107.96825: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000e6f 12053 1726773107.96870: calling self._execute() 12053 1726773107.98650: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12053 1726773107.98740: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12053 1726773107.98913: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12053 1726773107.98941: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12053 1726773107.98967: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12053 1726773107.98996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12053 1726773107.99045: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12053 1726773107.99067: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12053 1726773107.99085: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12053 1726773107.99167: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12053 1726773107.99186: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12053 1726773107.99202: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12053 1726773107.99437: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12053 1726773107.99441: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12053 
1726773107.99444: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12053 1726773107.99446: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12053 1726773107.99447: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12053 1726773107.99449: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12053 1726773107.99451: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12053 1726773107.99453: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12053 1726773107.99455: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12053 1726773107.99469: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12053 1726773107.99471: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12053 1726773107.99473: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12053 1726773107.99530: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12053 1726773107.99565: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12053 1726773107.99575: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12053 1726773107.99587: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12053 1726773107.99593: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12053 1726773107.99692: Loading ActionModule 'set_fact' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12053 1726773107.99700: starting attempt loop 12053 1726773107.99703: running the handler 12053 1726773107.99714: handler run complete 12053 1726773107.99717: attempt loop complete, returning result 12053 1726773107.99719: _execute() done 12053 1726773107.99721: dumping result to json 12053 1726773107.99723: done dumping result, returning 12053 1726773107.99728: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [12a3200b-1e9d-1dbd-cc52-000000000e6f] 12053 1726773107.99734: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6f 12053 1726773107.99773: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000e6f 12053 1726773107.99777: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": 
true }, "changed": false } 8119 1726773107.99950: no more pending results, returning what we have 8119 1726773107.99954: results queue empty 8119 1726773107.99956: checking for any_errors_fatal 8119 1726773107.99961: done checking for any_errors_fatal 8119 1726773107.99963: checking for max_fail_percentage 8119 1726773107.99966: done checking for max_fail_percentage 8119 1726773107.99968: checking to see if all hosts have failed and the running result is not ok 8119 1726773107.99970: done checking to see if all hosts have failed 8119 1726773107.99972: getting the remaining hosts for this loop 8119 1726773107.99975: done getting the remaining hosts for this loop 8119 1726773107.99982: building list of next tasks for hosts 8119 1726773107.99987: getting the next task for host managed_node2 8119 1726773107.99997: done getting next task for host managed_node2 8119 1726773108.00000: ^ task is: TASK: Verify no settings 8119 1726773108.00004: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773108.00006: done building task lists 8119 1726773108.00008: counting tasks in each state of execution 8119 1726773108.00012: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0 8119 1726773108.00014: advancing hosts in ITERATING_TASKS 8119 1726773108.00016: starting to advance hosts 8119 1726773108.00018: getting the next task for host managed_node2 8119 1726773108.00024: done getting next task for host managed_node2 8119 1726773108.00026: ^ task is: TASK: Verify no settings 8119 1726773108.00029: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773108.00031: done advancing hosts to next task 8119 1726773108.00046: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773108.00050: getting variables 8119 1726773108.00053: in VariableManager get_vars() 8119 1726773108.00087: Calling all_inventory to load vars for managed_node2 8119 1726773108.00094: Calling groups_inventory to load vars for managed_node2 8119 1726773108.00097: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773108.00120: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.00131: Calling all_plugins_play to load vars for managed_node2 8119 1726773108.00141: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.00150: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773108.00160: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.00166: Calling groups_plugins_play to load vars for managed_node2 8119 1726773108.00175: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.00195: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.00210: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.00410: done with get_vars() 8119 1726773108.00421: done getting variables 8119 1726773108.00425: sending task start callback, copying the task so we can template it temporarily 8119 1726773108.00427: done copying, going to template now 8119 1726773108.00429: done templating 8119 1726773108.00430: here goes the callback... TASK [Verify no settings] ****************************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20 Thursday 19 September 2024 15:11:48 -0400 (0:00:00.039) 0:01:42.560 **** 8119 1726773108.00446: sending task start callback 8119 1726773108.00448: entering _queue_task() for managed_node2/shell 8119 1726773108.00576: worker is 1 (out of 1 available) 8119 1726773108.00617: exiting _queue_task() for managed_node2/shell 8119 1726773108.00691: done queuing things up, now waiting for results queue to drain 8119 1726773108.00697: waiting for pending results... 
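The "Verify no settings" task (tests/kernel_settings/tasks/cleanup.yml:20) is a test-side shell task whose full script is visible in the module invocation below. Reassembled from that invocation, the task amounts to the following; the script text is taken verbatim from the trace, only the YAML wrapping is reconstructed:

    - name: Verify no settings
      shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        conf=/etc/tuned/kernel_settings/tuned.conf
        for section in sysctl sysfs systemd vm; do
          if grep ^\\["$section"\\] "$conf"; then
            echo ERROR: "$section" settings present
            rc=1
          fi
        done
        exit "$rc"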
12055 1726773108.00758: running TaskExecutor() for managed_node2/TASK: Verify no settings 12055 1726773108.00806: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c4a 12055 1726773108.00855: calling self._execute() 12055 1726773108.02754: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12055 1726773108.02855: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12055 1726773108.02912: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12055 1726773108.02942: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12055 1726773108.02968: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12055 1726773108.02997: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12055 1726773108.03049: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12055 1726773108.03073: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12055 1726773108.03094: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12055 1726773108.03185: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12055 1726773108.03204: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12055 1726773108.03222: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12055 1726773108.03515: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12055 1726773108.03548: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12055 1726773108.03558: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12055 1726773108.03568: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12055 1726773108.03573: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12055 1726773108.03669: Loading ActionModule 'shell' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12055 1726773108.03678: starting attempt loop 12055 1726773108.03680: running the handler 12055 1726773108.03687: Loading ActionModule 'command' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12055 1726773108.03698: _low_level_execute_command(): starting 12055 1726773108.03702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12055 1726773108.06227: stdout chunk (state=2): >>>/root <<< 12055 1726773108.06348: stderr chunk (state=3): >>><<< 12055 1726773108.06354: stdout chunk (state=3): >>><<< 12055 
1726773108.06376: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 12055 1726773108.06393: _low_level_execute_command(): starting 12055 1726773108.06401: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613 `" && echo ansible-tmp-1726773108.063875-12055-205947048859613="` echo /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613 `" ) && sleep 0' 12055 1726773108.09267: stdout chunk (state=2): >>>ansible-tmp-1726773108.063875-12055-205947048859613=/root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613 <<< 12055 1726773108.09397: stderr chunk (state=3): >>><<< 12055 1726773108.09403: stdout chunk (state=3): >>><<< 12055 1726773108.09424: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773108.063875-12055-205947048859613=/root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613 , stderr= 12055 1726773108.09537: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/command-ZIP_DEFLATED 12055 1726773108.09591: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613/AnsiballZ_command.py 12055 1726773108.09926: Sending initial data 12055 1726773108.09943: Sent initial data (154 bytes) 12055 1726773108.12418: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmptda8wpqq /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613/AnsiballZ_command.py <<< 12055 1726773108.13409: stderr chunk (state=3): >>><<< 12055 1726773108.13417: stdout chunk (state=3): >>><<< 12055 1726773108.13446: done transferring module to remote 12055 1726773108.13461: _low_level_execute_command(): starting 12055 1726773108.13465: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613/ /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613/AnsiballZ_command.py && sleep 0' 12055 1726773108.16048: stderr chunk (state=2): >>><<< 12055 1726773108.16061: stdout chunk (state=2): >>><<< 12055 1726773108.16080: _low_level_execute_command() done: rc=0, stdout=, stderr= 12055 1726773108.16086: _low_level_execute_command(): starting 12055 1726773108.16092: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613/AnsiballZ_command.py && sleep 0' 12055 1726773108.31672: stdout chunk (state=2): >>> {"cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "start": "2024-09-19 15:11:48.307428", "end": "2024-09-19 15:11:48.314752", "delta": "0:00:00.007324", "changed": true, "invocation": 
{"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12055 1726773108.32739: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12055 1726773108.32792: stderr chunk (state=3): >>><<< 12055 1726773108.32797: stdout chunk (state=3): >>><<< 12055 1726773108.32818: _low_level_execute_command() done: rc=0, stdout= {"cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "start": "2024-09-19 15:11:48.307428", "end": "2024-09-19 15:11:48.314752", "delta": "0:00:00.007324", "changed": true, "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
12055 1726773108.32853: done with _execute_module (command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\["$section"\\\\] "$conf"; then\n echo ERROR: "$section" settings present\n rc=1\n fi\ndone\nexit "$rc"\n', '_uses_shell': True, 'warn': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12055 1726773108.32864: _low_level_execute_command(): starting 12055 1726773108.32868: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773108.063875-12055-205947048859613/ > /dev/null 2>&1 && sleep 0' 12055 1726773108.35517: stderr chunk (state=2): >>><<< 12055 1726773108.35529: stdout chunk (state=2): >>><<< 12055 1726773108.35548: _low_level_execute_command() done: rc=0, stdout=, stderr= 12055 1726773108.35559: handler run complete 12055 1726773108.35568: attempt loop complete, returning result 12055 1726773108.35581: _execute() done 12055 1726773108.35584: dumping result to json 12055 1726773108.35589: done dumping result, returning 12055 1726773108.35599: done running TaskExecutor() for managed_node2/TASK: Verify no settings [12a3200b-1e9d-1dbd-cc52-000000000c4a] 12055 1726773108.35613: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4a 12055 1726773108.35650: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4a 12055 1726773108.35654: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "delta": "0:00:00.007324", "end": "2024-09-19 15:11:48.314752", "rc": 0, "start": "2024-09-19 15:11:48.307428" } STDERR: + exec + rc=0 + conf=/etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf + exit 0 8119 1726773108.35901: no more pending results, returning what we have 8119 1726773108.35906: results queue empty 8119 1726773108.35908: checking for any_errors_fatal 8119 1726773108.35914: done checking for any_errors_fatal 8119 1726773108.35916: checking for max_fail_percentage 8119 1726773108.35919: done checking for max_fail_percentage 8119 1726773108.35921: checking to see if all hosts have failed and the running result is not ok 8119 1726773108.35923: done checking to see if all hosts have failed 8119 1726773108.35925: getting the remaining hosts for this loop 8119 1726773108.35927: done getting the remaining hosts for this loop 8119 
1726773108.35935: building list of next tasks for hosts 8119 1726773108.35938: getting the next task for host managed_node2 8119 1726773108.35943: done getting next task for host managed_node2 8119 1726773108.35945: ^ task is: TASK: Remove kernel_settings tuned profile 8119 1726773108.35948: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=1, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773108.35950: done building task lists 8119 1726773108.35951: counting tasks in each state of execution 8119 1726773108.35954: done counting tasks in each state of execution: num_setups: 0 num_tasks: 0 num_rescue: 0 num_always: 1 8119 1726773108.35956: advancing hosts in ITERATING_ALWAYS 8119 1726773108.35957: starting to advance hosts 8119 1726773108.35959: getting the next task for host managed_node2 8119 1726773108.35961: done getting next task for host managed_node2 8119 1726773108.35963: ^ task is: TASK: Remove kernel_settings tuned profile 8119 1726773108.35964: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=1, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773108.35966: done advancing hosts to next task 8119 1726773108.35976: getting variables 8119 1726773108.35979: in VariableManager get_vars() 8119 1726773108.36006: Calling all_inventory to load vars for managed_node2 8119 1726773108.36012: Calling groups_inventory to load vars for managed_node2 8119 1726773108.36014: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773108.36036: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.36046: Calling all_plugins_play to load vars for managed_node2 8119 1726773108.36056: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.36065: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773108.36075: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.36081: Calling groups_plugins_play to load vars for managed_node2 8119 1726773108.36095: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.36115: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.36129: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.36318: done with get_vars() 8119 1726773108.36328: done getting variables 8119 1726773108.36332: sending task start callback, copying the task so we can template it temporarily 8119 1726773108.36334: done copying, going to template now 8119 1726773108.36335: done templating 8119 1726773108.36337: here goes the callback... TASK [Remove kernel_settings tuned profile] ************************************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36 Thursday 19 September 2024 15:11:48 -0400 (0:00:00.359) 0:01:42.919 **** 8119 1726773108.36352: sending task start callback 8119 1726773108.36353: entering _queue_task() for managed_node2/file 8119 1726773108.36470: worker is 1 (out of 1 available) 8119 1726773108.36510: exiting _queue_task() for managed_node2/file 8119 1726773108.36580: done queuing things up, now waiting for results queue to drain 8119 1726773108.36587: waiting for pending results... 
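The next cleanup step, queued above for the file module from cleanup.yml:36, removes the generated profile directory. The module arguments logged below (path /etc/tuned/kernel_settings, state absent) correspond to a task along these lines; the YAML shape is a sketch, only the path and state come from the log:

- name: Remove kernel_settings tuned profile
  file:
    path: /etc/tuned/kernel_settings
    state: absent   # deletes the directory and its tuned.conf, as the diff in the result shows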
12064 1726773108.36648: running TaskExecutor() for managed_node2/TASK: Remove kernel_settings tuned profile 12064 1726773108.36694: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c4b 12064 1726773108.36742: calling self._execute() 12064 1726773108.38574: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12064 1726773108.38660: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12064 1726773108.38739: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12064 1726773108.38766: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12064 1726773108.38795: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12064 1726773108.38830: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12064 1726773108.38875: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12064 1726773108.38899: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12064 1726773108.38922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12064 1726773108.39001: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12064 1726773108.39021: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12064 1726773108.39037: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12064 1726773108.39278: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12064 1726773108.39316: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12064 1726773108.39327: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12064 1726773108.39338: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12064 1726773108.39343: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12064 1726773108.39430: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 12064 1726773108.39443: plugin lookup for file failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 12064 1726773108.39466: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 12064 1726773108.39481: starting attempt loop 12064 1726773108.39486: running the handler 12064 1726773108.39496: _low_level_execute_command(): starting 12064 1726773108.39501: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12064 1726773108.42021: stdout chunk (state=2): >>>/root <<< 12064 1726773108.42136: stderr chunk (state=3): >>><<< 12064 1726773108.42141: stdout chunk (state=3): >>><<< 12064 1726773108.42164: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr= 12064 1726773108.42180: _low_level_execute_command(): starting 12064 1726773108.42187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169 `" && echo ansible-tmp-1726773108.4217424-12064-241072059371169="` echo /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169 `" ) && sleep 0' 12064 1726773108.44963: stdout chunk (state=2): >>>ansible-tmp-1726773108.4217424-12064-241072059371169=/root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169 <<< 12064 1726773108.45094: stderr chunk (state=3): >>><<< 12064 1726773108.45100: stdout chunk (state=3): >>><<< 12064 1726773108.45124: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773108.4217424-12064-241072059371169=/root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169 , stderr= 12064 1726773108.45215: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/file-ZIP_DEFLATED 12064 1726773108.45273: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169/AnsiballZ_file.py 12064 1726773108.45613: Sending initial data 12064 1726773108.45627: Sent initial data (152 bytes) 12064 1726773108.48135: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpdcze3q9j /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169/AnsiballZ_file.py <<< 12064 1726773108.49147: stderr chunk (state=3): >>><<< 12064 1726773108.49154: stdout chunk (state=3): >>><<< 12064 1726773108.49177: done transferring module to remote 12064 1726773108.49192: _low_level_execute_command(): starting 12064 1726773108.49197: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169/ /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169/AnsiballZ_file.py && sleep 0' 12064 1726773108.51815: stderr chunk (state=2): >>><<< 12064 1726773108.51828: stdout chunk (state=2): >>><<< 12064 1726773108.51848: _low_level_execute_command() done: rc=0, stdout=, stderr= 12064 1726773108.51852: _low_level_execute_command(): starting 12064 1726773108.51858: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169/AnsiballZ_file.py && sleep 0' 12064 1726773108.66932: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} <<< 12064 1726773108.67937: stderr chunk (state=3): >>>Shared connection to 
10.31.8.150 closed. <<< 12064 1726773108.67990: stderr chunk (state=3): >>><<< 12064 1726773108.67995: stdout chunk (state=3): >>><<< 12064 1726773108.68017: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null}}} , stderr=Shared connection to 10.31.8.150 closed. 12064 1726773108.68047: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12064 1726773108.68057: _low_level_execute_command(): starting 12064 1726773108.68065: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773108.4217424-12064-241072059371169/ > /dev/null 2>&1 && sleep 0' 12064 1726773108.70670: stderr chunk (state=2): >>><<< 12064 1726773108.70681: stdout chunk (state=2): >>><<< 12064 1726773108.70703: _low_level_execute_command() done: rc=0, stdout=, stderr= 12064 1726773108.70710: handler run complete 12064 1726773108.70716: attempt loop complete, returning result 12064 1726773108.70730: _execute() done 12064 1726773108.70732: dumping result to json 12064 1726773108.70741: done dumping result, returning 12064 1726773108.70752: done running TaskExecutor() for managed_node2/TASK: Remove kernel_settings tuned profile [12a3200b-1e9d-1dbd-cc52-000000000c4b] 12064 1726773108.70766: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4b 12064 1726773108.70804: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4b 12064 1726773108.70808: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "path": "/etc/tuned/kernel_settings", "state": "absent" } 8119 1726773108.71079: no more pending results, returning what we have 8119 1726773108.71085: results queue empty 8119 1726773108.71087: checking for any_errors_fatal 8119 1726773108.71091: done checking for any_errors_fatal 8119 1726773108.71093: checking for max_fail_percentage 8119 1726773108.71095: done checking for max_fail_percentage 8119 1726773108.71096: checking to see if all hosts have failed and the running result is not ok 8119 1726773108.71098: done checking to see if all 
hosts have failed 8119 1726773108.71099: getting the remaining hosts for this loop 8119 1726773108.71101: done getting the remaining hosts for this loop 8119 1726773108.71107: building list of next tasks for hosts 8119 1726773108.71111: getting the next task for host managed_node2 8119 1726773108.71117: done getting next task for host managed_node2 8119 1726773108.71119: ^ task is: TASK: Get active_profile 8119 1726773108.71122: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=2, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773108.71123: done building task lists 8119 1726773108.71124: counting tasks in each state of execution 8119 1726773108.71127: done counting tasks in each state of execution: num_setups: 0 num_tasks: 0 num_rescue: 0 num_always: 1 8119 1726773108.71129: advancing hosts in ITERATING_ALWAYS 8119 1726773108.71130: starting to advance hosts 8119 1726773108.71131: getting the next task for host managed_node2 8119 1726773108.71134: done getting next task for host managed_node2 8119 1726773108.71136: ^ task is: TASK: Get active_profile 8119 1726773108.71138: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=2, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773108.71139: done advancing hosts to next task 8119 1726773108.71150: getting variables 8119 1726773108.71152: in VariableManager get_vars() 8119 1726773108.71178: Calling all_inventory to load vars for managed_node2 8119 1726773108.71182: Calling groups_inventory to load vars for managed_node2 8119 1726773108.71187: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773108.71211: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.71222: Calling all_plugins_play to load vars for managed_node2 8119 1726773108.71233: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.71242: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773108.71252: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.71258: Calling groups_plugins_play to load vars for managed_node2 8119 1726773108.71267: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.71288: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.71303: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773108.71505: done with get_vars() 8119 1726773108.71518: done getting variables 8119 1726773108.71522: sending task start callback, copying the task so we can template it temporarily 8119 1726773108.71524: done copying, going to template now 8119 1726773108.71526: done templating 8119 1726773108.71527: here goes the callback... TASK [Get active_profile] ****************************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41 Thursday 19 September 2024 15:11:48 -0400 (0:00:00.351) 0:01:43.271 **** 8119 1726773108.71542: sending task start callback 8119 1726773108.71544: entering _queue_task() for managed_node2/slurp 8119 1726773108.71668: worker is 1 (out of 1 available) 8119 1726773108.71711: exiting _queue_task() for managed_node2/slurp 8119 1726773108.71781: done queuing things up, now waiting for results queue to drain 8119 1726773108.71788: waiting for pending results... 
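"Get active_profile" (cleanup.yml:41) uses the slurp module, which reads a remote file and returns its contents base64-encoded, so the controller can inspect /etc/tuned/active_profile without running a shell command. A sketch of the task; the register name is an assumption, since the test's actual variable name does not appear in this log:

- name: Get active_profile
  slurp:
    path: /etc/tuned/active_profile
  register: __active_profile   # hypothetical name for the slurped result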
12073 1726773108.71843: running TaskExecutor() for managed_node2/TASK: Get active_profile 12073 1726773108.71890: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c4c 12073 1726773108.71937: calling self._execute() 12073 1726773108.73741: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12073 1726773108.73825: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12073 1726773108.73877: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12073 1726773108.73910: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12073 1726773108.73938: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12073 1726773108.73976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12073 1726773108.74030: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12073 1726773108.74053: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12073 1726773108.74070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12073 1726773108.74147: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12073 1726773108.74164: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12073 1726773108.74177: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12073 1726773108.74391: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12073 1726773108.74425: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12073 1726773108.74435: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12073 1726773108.74445: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12073 1726773108.74449: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12073 1726773108.74531: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 12073 1726773108.74544: plugin lookup for slurp failed; errors: No module named 'ansible_collections.fedora.linux_system_roles.plugins.action' 12073 1726773108.74566: Loading ActionModule 'normal' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) 12073 1726773108.74585: starting attempt loop 12073 1726773108.74588: running the handler 12073 1726773108.74596: _low_level_execute_command(): starting 12073 1726773108.74599: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12073 1726773108.77067: stdout chunk (state=2): >>>/root <<< 12073 1726773108.77190: stderr chunk (state=3): >>><<< 12073 1726773108.77201: stdout chunk (state=3): >>><<< 12073 1726773108.77220: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr= 12073 1726773108.77235: _low_level_execute_command(): starting 12073 1726773108.77240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276 `" && echo ansible-tmp-1726773108.7722867-12073-102569596220276="` echo /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276 `" ) && sleep 0' 12073 1726773108.79984: stdout chunk (state=2): >>>ansible-tmp-1726773108.7722867-12073-102569596220276=/root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276 <<< 12073 1726773108.80116: stderr chunk (state=3): >>><<< 12073 1726773108.80121: stdout chunk (state=3): >>><<< 12073 1726773108.80140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773108.7722867-12073-102569596220276=/root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276 , stderr= 12073 1726773108.80218: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/slurp-ZIP_DEFLATED 12073 1726773108.80269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276/AnsiballZ_slurp.py 12073 1726773108.80582: Sending initial data 12073 1726773108.80600: Sent initial data (153 bytes) 12073 1726773108.83036: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpyhqycega /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276/AnsiballZ_slurp.py <<< 12073 1726773108.84012: stderr chunk (state=3): >>><<< 12073 1726773108.84018: stdout chunk (state=3): >>><<< 12073 1726773108.84045: done transferring module to remote 12073 1726773108.84060: _low_level_execute_command(): starting 12073 1726773108.84064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276/ /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276/AnsiballZ_slurp.py && sleep 0' 12073 1726773108.86630: stderr chunk (state=2): >>><<< 12073 1726773108.86641: stdout chunk (state=2): >>><<< 12073 1726773108.86660: _low_level_execute_command() done: rc=0, stdout=, stderr= 12073 1726773108.86663: _low_level_execute_command(): starting 12073 1726773108.86669: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276/AnsiballZ_slurp.py && sleep 0' 12073 1726773109.00952: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 12073 1726773109.01918: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12073 1726773109.01965: stderr chunk (state=3): >>><<< 12073 1726773109.01970: stdout chunk (state=3): >>><<< 12073 1726773109.01992: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.150 closed. 
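The slurp result's content field, dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK, base64-decodes to "virtual-guest kernel_settings" plus a newline (30 bytes, matching the size reported by the stat call further below), i.e. the kernel_settings profile is still appended to the active tuned profile at this point. A later task can recover the plain text with the b64decode filter; a small sketch, reusing the assumed register name from above:

- name: Show decoded active_profile
  debug:
    msg: "{{ __active_profile.content | b64decode | trim }}"
  # with the content logged above, this prints: virtual-guest kernel_settings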
12073 1726773109.02023: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12073 1726773109.02036: _low_level_execute_command(): starting 12073 1726773109.02041: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773108.7722867-12073-102569596220276/ > /dev/null 2>&1 && sleep 0' 12073 1726773109.04684: stderr chunk (state=2): >>><<< 12073 1726773109.04696: stdout chunk (state=2): >>><<< 12073 1726773109.04720: _low_level_execute_command() done: rc=0, stdout=, stderr= 12073 1726773109.04728: handler run complete 12073 1726773109.04753: attempt loop complete, returning result 12073 1726773109.04766: _execute() done 12073 1726773109.04767: dumping result to json 12073 1726773109.04770: done dumping result, returning 12073 1726773109.04780: done running TaskExecutor() for managed_node2/TASK: Get active_profile [12a3200b-1e9d-1dbd-cc52-000000000c4c] 12073 1726773109.04794: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4c 12073 1726773109.04834: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4c 12073 1726773109.04895: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8119 1726773109.05028: no more pending results, returning what we have 8119 1726773109.05033: results queue empty 8119 1726773109.05035: checking for any_errors_fatal 8119 1726773109.05041: done checking for any_errors_fatal 8119 1726773109.05042: checking for max_fail_percentage 8119 1726773109.05045: done checking for max_fail_percentage 8119 1726773109.05047: checking to see if all hosts have failed and the running result is not ok 8119 1726773109.05049: done checking to see if all hosts have failed 8119 1726773109.05051: getting the remaining hosts for this loop 8119 1726773109.05054: done getting the remaining hosts for this loop 8119 1726773109.05061: building list of next tasks for hosts 8119 1726773109.05064: getting the next task for host managed_node2 8119 1726773109.05071: done getting next task for host managed_node2 8119 1726773109.05073: ^ task is: TASK: Ensure kernel_settings is not in active_profile 8119 1726773109.05077: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8119 1726773109.05080: done building task lists 8119 1726773109.05082: counting tasks in each state of execution 8119 1726773109.05088: done counting tasks in each state of execution: num_setups: 0 num_tasks: 0 num_rescue: 0 num_always: 1 8119 1726773109.05091: advancing hosts in ITERATING_ALWAYS 8119 1726773109.05093: starting to advance hosts 8119 1726773109.05095: getting the next task for host managed_node2 8119 1726773109.05099: done getting next task for host managed_node2 8119 1726773109.05101: ^ task is: TASK: Ensure kernel_settings is not in active_profile 8119 1726773109.05104: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773109.05106: done advancing hosts to next task 8119 1726773109.05122: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773109.05126: getting variables 8119 1726773109.05129: in VariableManager get_vars() 8119 1726773109.05162: Calling all_inventory to load vars for managed_node2 8119 1726773109.05168: Calling groups_inventory to load vars for managed_node2 8119 1726773109.05171: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773109.05199: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.05211: Calling all_plugins_play to load vars for managed_node2 8119 1726773109.05222: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.05231: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773109.05242: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.05248: Calling groups_plugins_play to load vars for managed_node2 8119 1726773109.05258: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.05275: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.05292: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.05492: done with get_vars() 8119 1726773109.05502: done getting variables 8119 1726773109.05507: sending task start callback, copying the task so we can template it temporarily 8119 1726773109.05509: done copying, going to template now 8119 1726773109.05511: done templating 8119 1726773109.05513: here goes the callback... 
TASK [Ensure kernel_settings is not in active_profile] ************************* task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 Thursday 19 September 2024 15:11:49 -0400 (0:00:00.339) 0:01:43.611 **** 8119 1726773109.05529: sending task start callback 8119 1726773109.05531: entering _queue_task() for managed_node2/copy 8119 1726773109.05652: worker is 1 (out of 1 available) 8119 1726773109.05692: exiting _queue_task() for managed_node2/copy 8119 1726773109.05761: done queuing things up, now waiting for results queue to drain 8119 1726773109.05765: waiting for pending results... 12085 1726773109.05829: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings is not in active_profile 12085 1726773109.05875: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c4d 12085 1726773109.05924: calling self._execute() 12085 1726773109.07732: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12085 1726773109.07821: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12085 1726773109.07879: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12085 1726773109.07911: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12085 1726773109.07994: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12085 1726773109.08026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12085 1726773109.08070: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12085 1726773109.08096: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12085 1726773109.08118: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12085 1726773109.08192: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12085 1726773109.08212: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12085 1726773109.08231: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12085 1726773109.08738: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12085 1726773109.08772: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12085 1726773109.08782: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12085 1726773109.08798: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12085 1726773109.08804: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12085 1726773109.08896: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12085 1726773109.08917: starting attempt loop 12085 
1726773109.08921: running the handler 12085 1726773109.08928: _low_level_execute_command(): starting 12085 1726773109.08932: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12085 1726773109.11401: stdout chunk (state=2): >>>/root <<< 12085 1726773109.11527: stderr chunk (state=3): >>><<< 12085 1726773109.11533: stdout chunk (state=3): >>><<< 12085 1726773109.11555: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 12085 1726773109.11569: _low_level_execute_command(): starting 12085 1726773109.11575: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181 `" && echo ansible-tmp-1726773109.1156337-12085-20525917205181="` echo /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181 `" ) && sleep 0' 12085 1726773109.14315: stdout chunk (state=2): >>>ansible-tmp-1726773109.1156337-12085-20525917205181=/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181 <<< 12085 1726773109.14434: stderr chunk (state=3): >>><<< 12085 1726773109.14440: stdout chunk (state=3): >>><<< 12085 1726773109.14459: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773109.1156337-12085-20525917205181=/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181 , stderr= 12085 1726773109.14603: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 12085 1726773109.14654: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_stat.py 12085 1726773109.14964: Sending initial data 12085 1726773109.14978: Sent initial data (151 bytes) 12085 1726773109.17378: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp529wvckw /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_stat.py <<< 12085 1726773109.18356: stderr chunk (state=3): >>><<< 12085 1726773109.18361: stdout chunk (state=3): >>><<< 12085 1726773109.18382: done transferring module to remote 12085 1726773109.18397: _low_level_execute_command(): starting 12085 1726773109.18403: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/ /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_stat.py && sleep 0' 12085 1726773109.20935: stderr chunk (state=2): >>><<< 12085 1726773109.20945: stdout chunk (state=2): >>><<< 12085 1726773109.20963: _low_level_execute_command() done: rc=0, stdout=, stderr= 12085 1726773109.20966: _low_level_execute_command(): starting 12085 1726773109.20972: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_stat.py && sleep 0' 12085 1726773109.36596: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773109.0068297, "mtime": 1726773106.3328412, "ctime": 1726773106.3328412, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 
4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 12085 1726773109.37676: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12085 1726773109.37730: stderr chunk (state=3): >>><<< 12085 1726773109.37735: stdout chunk (state=3): >>><<< 12085 1726773109.37755: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 301990082, "dev": 51713, "nlink": 1, "atime": 1726773109.0068297, "mtime": 1726773106.3328412, "ctime": 1726773106.3328412, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1755096851", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 
12085 1726773109.37818: done with _execute_module (stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12085 1726773109.38161: Sending initial data 12085 1726773109.38176: Sent initial data (140 bytes) 12085 1726773109.40677: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp1b1w2vzc /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source <<< 12085 1726773109.40981: stderr chunk (state=3): >>><<< 12085 1726773109.40988: stdout chunk (state=3): >>><<< 12085 1726773109.41017: _low_level_execute_command(): starting 12085 1726773109.41023: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/ /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source && sleep 0' 12085 1726773109.43511: stderr chunk (state=2): >>><<< 12085 1726773109.43523: stdout chunk (state=2): >>><<< 12085 1726773109.43541: _low_level_execute_command() done: rc=0, stdout=, stderr= 12085 1726773109.43648: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 12085 1726773109.43691: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_copy.py 12085 1726773109.44087: Sending initial data 12085 1726773109.44100: Sent initial data (151 bytes) 12085 1726773109.46429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpystxsjdk /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_copy.py <<< 12085 1726773109.47428: stderr chunk (state=3): >>><<< 12085 1726773109.47432: stdout chunk (state=3): >>><<< 12085 1726773109.47453: done transferring module to remote 12085 1726773109.47464: _low_level_execute_command(): starting 12085 1726773109.47468: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/ /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_copy.py && sleep 0' 12085 1726773109.50006: stderr chunk (state=2): >>><<< 12085 1726773109.50022: stdout chunk (state=2): >>><<< 12085 1726773109.50041: _low_level_execute_command() done: rc=0, stdout=, stderr= 12085 1726773109.50045: _low_level_execute_command(): starting 12085 1726773109.50051: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/AnsiballZ_copy.py && sleep 0' 12085 1726773109.65695: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", 
"group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source", "_original_basename": "tmp1b1w2vzc", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 12085 1726773109.66742: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12085 1726773109.66793: stderr chunk (state=3): >>><<< 12085 1726773109.66799: stdout chunk (state=3): >>><<< 12085 1726773109.66821: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source", "_original_basename": "tmp1b1w2vzc", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
12085 1726773109.66851: done with _execute_module (copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source', '_original_basename': 'tmp1b1w2vzc', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12085 1726773109.66867: _low_level_execute_command(): starting 12085 1726773109.66874: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/ > /dev/null 2>&1 && sleep 0' 12085 1726773109.69484: stderr chunk (state=2): >>><<< 12085 1726773109.69497: stdout chunk (state=2): >>><<< 12085 1726773109.69517: _low_level_execute_command() done: rc=0, stdout=, stderr= 12085 1726773109.69527: handler run complete 12085 1726773109.69532: attempt loop complete, returning result 12085 1726773109.69544: _execute() done 12085 1726773109.69547: dumping result to json 12085 1726773109.69552: done dumping result, returning 12085 1726773109.69564: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings is not in active_profile [12a3200b-1e9d-1dbd-cc52-000000000c4d] 12085 1726773109.69578: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4d 12085 1726773109.69615: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4d 12085 1726773109.69658: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "src": "/root/.ansible/tmp/ansible-tmp-1726773109.1156337-12085-20525917205181/source", "state": "file", "uid": 0 } 8119 1726773109.69801: no more pending results, returning what we have 8119 1726773109.69806: results queue empty 8119 1726773109.69811: checking for any_errors_fatal 8119 1726773109.69816: done checking for any_errors_fatal 8119 1726773109.69818: checking for max_fail_percentage 8119 1726773109.69821: done checking for max_fail_percentage 8119 1726773109.69823: checking to see if all hosts have failed and the running result is not ok 8119 1726773109.69825: done checking to see if all hosts have failed 8119 1726773109.69827: getting the remaining hosts for this loop 8119 1726773109.69830: done getting the remaining hosts for this loop 8119 1726773109.69838: building list of next tasks for hosts 8119 1726773109.69840: getting the next task for host managed_node2 8119 1726773109.69847: done getting next task for host managed_node2 8119 1726773109.69850: ^ task is: TASK: Set profile_mode to auto 8119 1726773109.69854: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=4, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773109.69856: done building task lists 8119 1726773109.69858: counting tasks in each state of execution 8119 1726773109.69862: done counting tasks in each state of execution: num_setups: 0 num_tasks: 0 num_rescue: 0 num_always: 1 8119 1726773109.69864: advancing hosts in ITERATING_ALWAYS 8119 1726773109.69867: starting to advance hosts 8119 1726773109.69869: getting the next task for host managed_node2 8119 1726773109.69872: done getting next task for host managed_node2 8119 1726773109.69875: ^ task is: TASK: Set profile_mode to auto 8119 1726773109.69877: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=4, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773109.69880: done advancing hosts to next task 8119 1726773109.69897: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773109.69901: getting variables 8119 1726773109.69904: in VariableManager get_vars() 8119 1726773109.69935: Calling all_inventory to load vars for managed_node2 8119 1726773109.69939: Calling groups_inventory to load vars for managed_node2 8119 1726773109.69942: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773109.69964: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.69974: Calling all_plugins_play to load vars for managed_node2 8119 1726773109.69987: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.69998: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773109.70011: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.70018: Calling groups_plugins_play to load vars for managed_node2 8119 1726773109.70027: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.70045: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.70058: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773109.70273: done with get_vars() 8119 1726773109.70286: done getting variables 
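Before the next task banner, here is a hedged sketch of how the file written by the previous task could be re-checked against the sha1 that its copy result reported; both task names and the registered variable are illustrative and are not part of the test.

- name: Re-stat active_profile (illustrative verification)
  stat:
    path: /etc/tuned/active_profile
    checksum_algorithm: sha1
  register: verify_active_profile   # hypothetical name

- name: Confirm it matches the checksum reported by the copy result above
  assert:
    that:
      - verify_active_profile.stat.checksum == "633f07e1b5698d04352d5dca735869bf2fe77897"
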
8119 1726773109.70291: sending task start callback, copying the task so we can template it temporarily 8119 1726773109.70293: done copying, going to template now 8119 1726773109.70295: done templating 8119 1726773109.70296: here goes the callback... TASK [Set profile_mode to auto] ************************************************ task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 Thursday 19 September 2024 15:11:49 -0400 (0:00:00.647) 0:01:44.259 **** 8119 1726773109.70315: sending task start callback 8119 1726773109.70317: entering _queue_task() for managed_node2/copy 8119 1726773109.70441: worker is 1 (out of 1 available) 8119 1726773109.70480: exiting _queue_task() for managed_node2/copy 8119 1726773109.70556: done queuing things up, now waiting for results queue to drain 8119 1726773109.70561: waiting for pending results... 12101 1726773109.70614: running TaskExecutor() for managed_node2/TASK: Set profile_mode to auto 12101 1726773109.70659: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c4e 12101 1726773109.70706: calling self._execute() 12101 1726773109.72472: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12101 1726773109.72562: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12101 1726773109.72630: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12101 1726773109.72661: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12101 1726773109.72691: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12101 1726773109.72721: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12101 1726773109.72772: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12101 1726773109.72797: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12101 1726773109.72815: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12101 1726773109.72897: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12101 1726773109.72915: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12101 1726773109.72931: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12101 1726773109.73153: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12101 1726773109.73187: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12101 1726773109.73198: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12101 1726773109.73208: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12101 1726773109.73216: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12101 1726773109.73307: Loading ActionModule 'copy' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12101 1726773109.73324: starting attempt loop 12101 1726773109.73327: running the handler 12101 1726773109.73338: _low_level_execute_command(): starting 12101 1726773109.73343: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12101 1726773109.75812: stdout chunk (state=2): >>>/root <<< 12101 1726773109.75930: stderr chunk (state=3): >>><<< 12101 1726773109.75935: stdout chunk (state=3): >>><<< 12101 1726773109.75954: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 12101 1726773109.75968: _low_level_execute_command(): starting 12101 1726773109.75973: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326 `" && echo ansible-tmp-1726773109.759623-12101-143681530652326="` echo /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326 `" ) && sleep 0' 12101 1726773109.78685: stdout chunk (state=2): >>>ansible-tmp-1726773109.759623-12101-143681530652326=/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326 <<< 12101 1726773109.78810: stderr chunk (state=3): >>><<< 12101 1726773109.78816: stdout chunk (state=3): >>><<< 12101 1726773109.78838: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773109.759623-12101-143681530652326=/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326 , stderr= 12101 1726773109.78976: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/stat-ZIP_DEFLATED 12101 1726773109.79028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_stat.py 12101 1726773109.79331: Sending initial data 12101 1726773109.79345: Sent initial data (151 bytes) 12101 1726773109.81788: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpfv8077ju /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_stat.py <<< 12101 1726773109.82772: stderr chunk (state=3): >>><<< 12101 1726773109.82777: stdout chunk (state=3): >>><<< 12101 1726773109.82800: done transferring module to remote 12101 1726773109.82817: _low_level_execute_command(): starting 12101 1726773109.82822: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/ /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_stat.py && sleep 0' 12101 1726773109.85343: stderr chunk (state=2): >>><<< 12101 1726773109.85353: stdout chunk (state=2): >>><<< 12101 1726773109.85371: _low_level_execute_command() done: rc=0, stdout=, stderr= 12101 1726773109.85374: _low_level_execute_command(): starting 12101 1726773109.85380: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_stat.py && sleep 0' 12101 1726773110.01023: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773104.4708493, "mtime": 
1726773106.3328412, "ctime": 1726773106.3328412, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} <<< 12101 1726773110.02117: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12101 1726773110.02168: stderr chunk (state=3): >>><<< 12101 1726773110.02173: stdout chunk (state=3): >>><<< 12101 1726773110.02195: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 308281538, "dev": 51713, "nlink": 1, "atime": 1726773104.4708493, "mtime": 1726773106.3328412, "ctime": 1726773106.3328412, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "51134487", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_md5": false, "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.150 closed. 
12101 1726773110.02256: done with _execute_module (stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12101 1726773110.02601: Sending initial data 12101 1726773110.02616: Sent initial data (140 bytes) 12101 1726773110.05125: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmpjfvmkwm2 /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source <<< 12101 1726773110.05435: stderr chunk (state=3): >>><<< 12101 1726773110.05441: stdout chunk (state=3): >>><<< 12101 1726773110.05470: _low_level_execute_command(): starting 12101 1726773110.05475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/ /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source && sleep 0' 12101 1726773110.08028: stderr chunk (state=2): >>><<< 12101 1726773110.08040: stdout chunk (state=2): >>><<< 12101 1726773110.08059: _low_level_execute_command() done: rc=0, stdout=, stderr= 12101 1726773110.08167: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/copy-ZIP_DEFLATED 12101 1726773110.08216: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_copy.py 12101 1726773110.08531: Sending initial data 12101 1726773110.08546: Sent initial data (151 bytes) 12101 1726773110.10975: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp1pfvta7m /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_copy.py <<< 12101 1726773110.11980: stderr chunk (state=3): >>><<< 12101 1726773110.11989: stdout chunk (state=3): >>><<< 12101 1726773110.12014: done transferring module to remote 12101 1726773110.12029: _low_level_execute_command(): starting 12101 1726773110.12035: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/ /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_copy.py && sleep 0' 12101 1726773110.14604: stderr chunk (state=2): >>><<< 12101 1726773110.14621: stdout chunk (state=2): >>><<< 12101 1726773110.14642: _low_level_execute_command() done: rc=0, stdout=, stderr= 12101 1726773110.14645: _low_level_execute_command(): starting 12101 1726773110.14652: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/AnsiballZ_copy.py && sleep 0' 12101 1726773110.30544: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", 
"group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source", "_original_basename": "tmpjfvmkwm2", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} <<< 12101 1726773110.31639: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12101 1726773110.31681: stderr chunk (state=3): >>><<< 12101 1726773110.31688: stdout chunk (state=3): >>><<< 12101 1726773110.31708: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source", "_original_basename": "tmpjfvmkwm2", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}} , stderr=Shared connection to 10.31.8.150 closed. 
12101 1726773110.31741: done with _execute_module (copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source', '_original_basename': 'tmpjfvmkwm2', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12101 1726773110.31757: _low_level_execute_command(): starting 12101 1726773110.31763: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/ > /dev/null 2>&1 && sleep 0' 12101 1726773110.34405: stderr chunk (state=2): >>><<< 12101 1726773110.34418: stdout chunk (state=2): >>><<< 12101 1726773110.34439: _low_level_execute_command() done: rc=0, stdout=, stderr= 12101 1726773110.34450: handler run complete 12101 1726773110.34456: attempt loop complete, returning result 12101 1726773110.34470: _execute() done 12101 1726773110.34472: dumping result to json 12101 1726773110.34476: done dumping result, returning 12101 1726773110.34489: done running TaskExecutor() for managed_node2/TASK: Set profile_mode to auto [12a3200b-1e9d-1dbd-cc52-000000000c4e] 12101 1726773110.34502: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4e 12101 1726773110.34540: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4e 12101 1726773110.34544: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "src": "/root/.ansible/tmp/ansible-tmp-1726773109.759623-12101-143681530652326/source", "state": "file", "uid": 0 } 8119 1726773110.34801: no more pending results, returning what we have 8119 1726773110.34807: results queue empty 8119 1726773110.34812: checking for any_errors_fatal 8119 1726773110.34818: done checking for any_errors_fatal 8119 1726773110.34820: checking for max_fail_percentage 8119 1726773110.34823: done checking for max_fail_percentage 8119 1726773110.34825: checking to see if all hosts have failed and the running result is not ok 8119 1726773110.34827: done checking to see if all hosts have failed 8119 1726773110.34829: getting the remaining hosts for this loop 8119 1726773110.34831: done getting the remaining hosts for this loop 8119 1726773110.34839: building list of next tasks for hosts 8119 1726773110.34842: getting the next task for host managed_node2 8119 1726773110.34849: done getting next task for host managed_node2 8119 1726773110.34852: ^ task is: TASK: Restart tuned 8119 1726773110.34855: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=5, rescue=0, always=5, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773110.34857: done building task lists 8119 1726773110.34858: counting tasks in each state of execution 8119 1726773110.34861: done counting tasks in each state of execution: num_setups: 0 num_tasks: 0 num_rescue: 0 num_always: 1 8119 1726773110.34863: advancing hosts in ITERATING_ALWAYS 8119 1726773110.34864: starting to advance hosts 8119 1726773110.34865: getting the next task for host managed_node2 8119 1726773110.34868: done getting next task for host managed_node2 8119 1726773110.34870: ^ task is: TASK: Restart tuned 8119 1726773110.34871: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=3, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=5, run_state=ITERATING_ALWAYS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8119 1726773110.34873: done advancing hosts to next task 8119 1726773110.34887: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 8119 1726773110.34890: getting variables 8119 1726773110.34893: in VariableManager get_vars() 8119 1726773110.34920: Calling all_inventory to load vars for managed_node2 8119 1726773110.34924: Calling groups_inventory to load vars for managed_node2 8119 1726773110.34927: Calling all_plugins_inventory to load vars for managed_node2 8119 1726773110.34949: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773110.34959: Calling all_plugins_play to load vars for managed_node2 8119 1726773110.34969: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773110.34978: Calling groups_plugins_inventory to load vars for managed_node2 8119 1726773110.34991: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773110.34998: Calling groups_plugins_play to load vars for managed_node2 8119 1726773110.35009: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773110.35028: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773110.35042: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.9/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8119 1726773110.35241: done with get_vars() 8119 1726773110.35251: done getting variables 8119 1726773110.35255: sending 
task start callback, copying the task so we can template it temporarily 8119 1726773110.35257: done copying, going to template now 8119 1726773110.35259: done templating 8119 1726773110.35261: here goes the callback... TASK [Restart tuned] *********************************************************** task path: /tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 Thursday 19 September 2024 15:11:50 -0400 (0:00:00.649) 0:01:44.909 **** 8119 1726773110.35275: sending task start callback 8119 1726773110.35277: entering _queue_task() for managed_node2/service 8119 1726773110.35398: worker is 1 (out of 1 available) 8119 1726773110.35439: exiting _queue_task() for managed_node2/service 8119 1726773110.35515: done queuing things up, now waiting for results queue to drain 8119 1726773110.35521: waiting for pending results... 12120 1726773110.35572: running TaskExecutor() for managed_node2/TASK: Restart tuned 12120 1726773110.35620: in run() - task 12a3200b-1e9d-1dbd-cc52-000000000c4f 12120 1726773110.37312: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py 12120 1726773110.37397: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py 12120 1726773110.37460: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py 12120 1726773110.37494: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py 12120 1726773110.37526: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py 12120 1726773110.37554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py 12120 1726773110.37607: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py 12120 1726773110.37637: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py 12120 1726773110.37656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py 12120 1726773110.37732: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py 12120 1726773110.37753: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py 12120 1726773110.37769: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py 12120 1726773110.37932: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12120 1726773110.37936: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12120 1726773110.37938: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12120 1726773110.37940: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12120 1726773110.37942: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12120 1726773110.37944: Loading FilterModule 
'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12120 1726773110.37945: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12120 1726773110.37947: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12120 1726773110.37949: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12120 1726773110.37966: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12120 1726773110.37970: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12120 1726773110.37972: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12120 1726773110.38125: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12120 1726773110.38130: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12120 1726773110.38132: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12120 1726773110.38134: Loading FilterModule 'json_query' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12120 1726773110.38136: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12120 1726773110.38138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12120 1726773110.38140: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12120 1726773110.38142: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12120 1726773110.38143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12120 1726773110.38160: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12120 1726773110.38163: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12120 1726773110.38164: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12120 1726773110.38253: trying /usr/local/lib/python3.9/site-packages/ansible/plugins/connection 12120 1726773110.38288: Loading Connection 'ssh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12120 1726773110.38299: trying 
/usr/local/lib/python3.9/site-packages/ansible/plugins/shell 12120 1726773110.38312: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12120 1726773110.38319: Loading ShellModule 'sh' from /usr/local/lib/python3.9/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12120 1726773110.38409: Loading ActionModule 'service' from /usr/local/lib/python3.9/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.9/site-packages/ansible/plugins/action:/usr/local/lib/python3.9/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12120 1726773110.38428: starting attempt loop 12120 1726773110.38431: running the handler 12120 1726773110.38552: _low_level_execute_command(): starting 12120 1726773110.38558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12120 1726773110.41023: stdout chunk (state=2): >>>/root <<< 12120 1726773110.41140: stderr chunk (state=3): >>><<< 12120 1726773110.41145: stdout chunk (state=3): >>><<< 12120 1726773110.41164: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 12120 1726773110.41177: _low_level_execute_command(): starting 12120 1726773110.41184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596 `" && echo ansible-tmp-1726773110.4117153-12120-231878949871596="` echo /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596 `" ) && sleep 0' 12120 1726773110.44059: stdout chunk (state=2): >>>ansible-tmp-1726773110.4117153-12120-231878949871596=/root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596 <<< 12120 1726773110.44189: stderr chunk (state=3): >>><<< 12120 1726773110.44195: stdout chunk (state=3): >>><<< 12120 1726773110.44214: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773110.4117153-12120-231878949871596=/root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596 , stderr= 12120 1726773110.44331: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-811980a93x9i/ansiballz_cache/systemd-ZIP_DEFLATED 12120 1726773110.44429: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596/AnsiballZ_systemd.py 12120 1726773110.44782: Sending initial data 12120 1726773110.44801: Sent initial data (155 bytes) 12120 1726773110.47232: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-811980a93x9i/tmp0ma6p_on /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596/AnsiballZ_systemd.py <<< 12120 1726773110.48978: stderr chunk (state=3): >>><<< 12120 1726773110.48985: stdout chunk (state=3): >>><<< 12120 1726773110.49012: done transferring module to remote 12120 1726773110.49027: _low_level_execute_command(): starting 12120 1726773110.49031: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596/ /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596/AnsiballZ_systemd.py && sleep 0' 12120 1726773110.51599: stderr chunk (state=2): >>><<< 12120 1726773110.51614: stdout chunk (state=2): >>><<< 12120 1726773110.51637: _low_level_execute_command() done: rc=0, stdout=, stderr= 12120 1726773110.51641: _low_level_execute_command(): starting 12120 1726773110.51648: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596/AnsiballZ_systemd.py && sleep 0' 12120 1726773110.77271: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18976768", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 12120 1726773110.77308: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", 
"CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChange<<< 12120 1726773110.77320: stdout chunk (state=3): >>>TimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": 
"no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} <<< 12120 1726773110.78711: stderr chunk (state=3): >>>Shared connection to 10.31.8.150 closed. <<< 12120 1726773110.78757: stderr chunk (state=3): >>><<< 12120 1726773110.78764: stdout chunk (state=3): >>><<< 12120 1726773110.78786: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "658", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18976768", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "multi-user.target shutdown.target", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "no_block": false, "force": null, "masked": null, "user": null, "scope": null}}} , stderr=Shared connection to 10.31.8.150 closed. 12120 1726773110.78919: done with _execute_module (systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.9.27', '_ansible_module_name': 'systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 12120 1726773110.78934: _low_level_execute_command(): starting 12120 1726773110.78940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773110.4117153-12120-231878949871596/ > /dev/null 2>&1 && sleep 0' 12120 1726773110.81567: stderr chunk (state=2): >>><<< 12120 1726773110.81579: stdout chunk (state=2): >>><<< 12120 1726773110.81600: _low_level_execute_command() done: rc=0, stdout=, stderr= 12120 1726773110.81610: handler run complete 12120 1726773110.81616: attempt loop complete, returning result 12120 1726773110.81686: Loading FilterModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12120 1726773110.81692: Loading FilterModule 'gcp_kms_filters' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/gcp_kms_filters.py (found_in_cache=True, class_only=False) 12120 1726773110.81695: Loading FilterModule 'ipaddr' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/ipaddr.py (found_in_cache=True, class_only=False) 12120 1726773110.81697: Loading FilterModule 'json_query' from 
/usr/local/lib/python3.9/site-packages/ansible/plugins/filter/json_query.py (found_in_cache=True, class_only=False) 12120 1726773110.81700: Loading FilterModule 'k8s' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/k8s.py (found_in_cache=True, class_only=False) 12120 1726773110.81702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12120 1726773110.81705: Loading FilterModule 'network' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/network.py (found_in_cache=True, class_only=False) 12120 1726773110.81707: Loading FilterModule 'urls' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12120 1726773110.81711: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.9/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12120 1726773110.81741: Loading TestModule 'core' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12120 1726773110.81744: Loading TestModule 'files' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12120 1726773110.81747: Loading TestModule 'mathstuff' from /usr/local/lib/python3.9/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12120 1726773110.81899: dumping result to json 12120 1726773110.82022: done dumping result, returning 12120 1726773110.82037: done running TaskExecutor() for managed_node2/TASK: Restart tuned [12a3200b-1e9d-1dbd-cc52-000000000c4f] 12120 1726773110.82047: sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4f 12120 1726773110.82052: done sending task result for task 12a3200b-1e9d-1dbd-cc52-000000000c4f 12120 1726773110.82054: WORKER PROCESS EXITING ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:57 EDT", "ActiveEnterTimestampMonotonic": "7018940", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice network.target dbus.socket systemd-sysctl.service systemd-journald.socket polkit.service sysinit.target dbus.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:56 EDT", "AssertTimestampMonotonic": "6195387", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot 
cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ConditionTimestampMonotonic": "6195386", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "658", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:56 EDT", "ExecMainStartTimestampMonotonic": "6196985", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:56 EDT] ; stop_time=[n/a] ; pid=658 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:56 EDT", "InactiveExitTimestampMonotonic": "6197042", "InvocationID": "98f9c935e1cc44368d26499759f58d0e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "658", "MemoryAccounting": "yes", "MemoryCurrent": "18976768", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", 
"NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:57 EDT", "StateChangeTimestampMonotonic": "7018940", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:57 EDT", "WatchdogTimestampMonotonic": "7018938", "WatchdogUSec": "0" } } 8119 1726773110.82529: no more pending results, returning what we have 8119 1726773110.82534: results queue empty 8119 1726773110.82536: checking for any_errors_fatal 8119 1726773110.82540: done checking for any_errors_fatal 8119 1726773110.82542: checking for max_fail_percentage 8119 1726773110.82544: done checking for max_fail_percentage 8119 1726773110.82545: checking to see if all hosts have failed and the running result is not ok 8119 1726773110.82547: done checking to see if all hosts have failed 8119 1726773110.82548: getting the remaining hosts for this loop 8119 1726773110.82550: done getting the remaining hosts for this loop 8119 1726773110.82554: building list of next tasks for hosts 8119 1726773110.82556: getting the next task for host managed_node2 8119 1726773110.82562: done getting next task for host managed_node2 8119 1726773110.82564: ^ task is: TASK: meta (flush_handlers) 8119 1726773110.82567: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8119 1726773110.82568: done building task lists
8119 1726773110.82569: counting tasks in each state of execution
8119 1726773110.82572: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0
8119 1726773110.82574: advancing hosts in ITERATING_TASKS
8119 1726773110.82575: starting to advance hosts
8119 1726773110.82577: getting the next task for host managed_node2
8119 1726773110.82580: done getting next task for host managed_node2
8119 1726773110.82581: ^ task is: TASK: meta (flush_handlers)
8119 1726773110.82585: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8119 1726773110.82587: done advancing hosts to next task
META: ran handlers
8119 1726773110.82607: done queuing things up, now waiting for results queue to drain
8119 1726773110.82609: results queue empty
8119 1726773110.82611: checking for any_errors_fatal
8119 1726773110.82613: done checking for any_errors_fatal
8119 1726773110.82614: checking for max_fail_percentage
8119 1726773110.82615: done checking for max_fail_percentage
8119 1726773110.82617: checking to see if all hosts have failed and the running result is not ok
8119 1726773110.82618: done checking to see if all hosts have failed
8119 1726773110.82619: getting the remaining hosts for this loop
8119 1726773110.82621: done getting the remaining hosts for this loop
8119 1726773110.82624: building list of next tasks for hosts
8119 1726773110.82626: getting the next task for host managed_node2
8119 1726773110.82628: done getting next task for host managed_node2
8119 1726773110.82630: ^ task is: TASK: meta (flush_handlers)
8119 1726773110.82631: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8119 1726773110.82632: done building task lists
8119 1726773110.82634: counting tasks in each state of execution
8119 1726773110.82635: done counting tasks in each state of execution: num_setups: 0 num_tasks: 1 num_rescue: 0 num_always: 0
8119 1726773110.82637: advancing hosts in ITERATING_TASKS
8119 1726773110.82638: starting to advance hosts
8119 1726773110.82639: getting the next task for host managed_node2
8119 1726773110.82641: done getting next task for host managed_node2
8119 1726773110.82643: ^ task is: TASK: meta (flush_handlers)
8119 1726773110.82644: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, run_state=ITERATING_TASKS, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8119 1726773110.82645: done advancing hosts to next task
META: ran handlers
8119 1726773110.82653: done queuing things up, now waiting for results queue to drain
8119 1726773110.82654: results queue empty
8119 1726773110.82656: checking for any_errors_fatal
8119 1726773110.82658: done checking for any_errors_fatal
8119 1726773110.82659: checking for max_fail_percentage
8119 1726773110.82660: done checking for max_fail_percentage
8119 1726773110.82661: checking to see if all hosts have failed and the running result is not ok
8119 1726773110.82662: done checking to see if all hosts have failed
8119 1726773110.82664: getting the remaining hosts for this loop
8119 1726773110.82665: done getting the remaining hosts for this loop
8119 1726773110.82668: building list of next tasks for hosts
8119 1726773110.82669: getting the next task for host managed_node2
8119 1726773110.82672: done getting next task for host managed_node2
8119 1726773110.82673: ^ task is: None
8119 1726773110.82674: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, run_state=ITERATING_COMPLETE, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8119 1726773110.82676: done building task lists
8119 1726773110.82677: counting tasks in each state of execution
8119 1726773110.82679: done counting tasks in each state of execution: num_setups: 0 num_tasks: 0 num_rescue: 0 num_always: 0
8119 1726773110.82680: all hosts are done, so returning None's for all hosts
8119 1726773110.82681: done queuing things up, now waiting for results queue to drain
8119 1726773110.82685: results queue empty
8119 1726773110.82686: checking for any_errors_fatal
8119 1726773110.82688: done checking for any_errors_fatal
8119 1726773110.82689: checking for max_fail_percentage
8119 1726773110.82690: done checking for max_fail_percentage
8119 1726773110.82691: checking to see if all hosts have failed and the running result is not ok
8119 1726773110.82692: done checking to see if all hosts have failed
8119 1726773110.82694: getting the next task for host managed_node2
8119 1726773110.82696: done getting next task for host managed_node2
8119 1726773110.82697: ^ task is: None
8119 1726773110.82698: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, run_state=ITERATING_COMPLETE, fail_state=FAILED_NONE, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
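For context on the result logged above: the ok status for TASK: Restart tuned comes from Ansible's systemd module, invoked once per loop item (only "tuned" here) with state=started and enabled=true; the large "status" mapping in the result is the set of unit properties systemd reports for tuned.service. A minimal sketch of a task with that shape, assuming loop-style syntax and illustrative wording rather than the test playbook's actual source, is:

    # Sketch only: a task of this shape produces the systemd-module invocation
    # logged above; the real "Restart tuned" task may differ in wording and loop syntax.
    - name: Restart tuned
      systemd:
        name: "{{ item }}"   # resolves to "tuned" for the logged item
        state: started
        enabled: true
      loop:
        - tuned

Because the unit was already active and enabled, the module reports "changed": false for the item.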
8119 1726773110.82700: running handlers

PLAY RECAP *********************************************************************
managed_node2 : ok=135 changed=19 unreachable=0 failed=0 skipped=58 rescued=0 ignored=0

Thursday 19 September 2024 15:11:50 -0400 (0:00:00.475) 0:01:45.384 ****
===============================================================================
Reboot the machine - see if settings persist after reboot -------------- 22.58s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:95
Ensure required packages are installed --------------------------------- 14.51s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:22
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 4.87s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.86s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.85s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.79s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.75s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.53s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.52s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.50s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
Gathering Facts --------------------------------------------------------- 1.48s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:2
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.46s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
Check sysfs after role runs --------------------------------------------- 1.37s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:79
Ensure required services are enabled and started ------------------------ 1.21s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:51
fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 0.85s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.84s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Generate a configuration for kernel settings ---------------------------- 0.83s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:45
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.80s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.79s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.73s
/tmp/collections-h7o/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
8119 1726773110.82848: RUNNING CLEANUP
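The recap above shows the play finishing with no failed or unreachable tasks, and the earlier result shows tuned left enabled and started on managed_node2. If a separate verification pass were wanted after a run like this, a minimal sketch using only core modules could look like the following; it is not part of the logged run, and the host pattern and service_facts key names are assumptions:

    # Hypothetical follow-up play, not taken from the logged playbook: re-check
    # that tuned is still enabled and running on the managed node.
    - hosts: managed_node2
      gather_facts: false
      tasks:
        - name: Collect service facts
          service_facts:

        - name: Assert tuned is enabled and running
          assert:
            that:
              - ansible_facts.services['tuned.service'].state == 'running'
              - ansible_facts.services['tuned.service'].status == 'enabled'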