34052 1727204412.96270: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-MVC executable location = /usr/local/bin/ansible-playbook python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 34052 1727204412.97162: Added group all to inventory 34052 1727204412.97164: Added group ungrouped to inventory 34052 1727204412.97373: Group all now contains ungrouped 34052 1727204412.97379: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml 34052 1727204413.21584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 34052 1727204413.21656: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 34052 1727204413.21685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 34052 1727204413.21755: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 34052 1727204413.21840: Loaded config def from plugin (inventory/script) 34052 1727204413.21842: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 34052 1727204413.21889: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 34052 1727204413.21990: Loaded config def from plugin (inventory/yaml) 34052 1727204413.21992: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 34052 1727204413.22090: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 34052 1727204413.22573: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 34052 1727204413.22578: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 34052 1727204413.22582: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 34052 1727204413.22589: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 34052 1727204413.22594: Loading data from /tmp/network-jrl/inventory-0Xx.yml 34052 1727204413.22670: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto 34052 1727204413.22742: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 34052 1727204413.22789: Loading data from /tmp/network-jrl/inventory-0Xx.yml 34052 1727204413.22891: group all already in inventory 34052 1727204413.22898: set inventory_file for managed-node1 34052 1727204413.22903: set inventory_dir for managed-node1 34052 1727204413.22904: Added host managed-node1 to inventory 34052 1727204413.22906: Added host managed-node1 to group all 34052 1727204413.22908: set ansible_host for managed-node1 34052 1727204413.22908: set ansible_ssh_extra_args for managed-node1 34052 1727204413.22912: set inventory_file for managed-node2 34052 1727204413.22914: set inventory_dir for managed-node2 34052 1727204413.22915: Added host managed-node2 to inventory 34052 1727204413.22917: Added host managed-node2 to group 
all 34052 1727204413.22918: set ansible_host for managed-node2 34052 1727204413.22919: set ansible_ssh_extra_args for managed-node2 34052 1727204413.22921: set inventory_file for managed-node3 34052 1727204413.22924: set inventory_dir for managed-node3 34052 1727204413.22924: Added host managed-node3 to inventory 34052 1727204413.22926: Added host managed-node3 to group all 34052 1727204413.22926: set ansible_host for managed-node3 34052 1727204413.22927: set ansible_ssh_extra_args for managed-node3 34052 1727204413.22930: Reconcile groups and hosts in inventory. 34052 1727204413.22934: Group ungrouped now contains managed-node1 34052 1727204413.22936: Group ungrouped now contains managed-node2 34052 1727204413.22938: Group ungrouped now contains managed-node3 34052 1727204413.23030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 34052 1727204413.23173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 34052 1727204413.23225: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 34052 1727204413.23256: Loaded config def from plugin (vars/host_group_vars) 34052 1727204413.23258: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 34052 1727204413.23269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 34052 1727204413.23279: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 34052 1727204413.23328: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 34052 1727204413.23898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204413.24125: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 34052 1727204413.24177: Loaded config def from plugin (connection/local) 34052 1727204413.24182: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 34052 1727204413.25926: Loaded config def from plugin (connection/paramiko_ssh) 34052 1727204413.25932: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 34052 1727204413.28057: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34052 1727204413.28123: Loaded config def from plugin (connection/psrp) 34052 1727204413.28128: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 34052 1727204413.29040: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34052 1727204413.29092: Loaded config def from plugin (connection/ssh) 34052 1727204413.29097: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 34052 1727204413.31545: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34052 1727204413.31596: Loaded config def from plugin (connection/winrm) 34052 1727204413.31600: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 34052 1727204413.31642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 34052 1727204413.31718: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 34052 1727204413.31800: Loaded config def from plugin (shell/cmd) 34052 1727204413.31802: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 34052 1727204413.31832: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 34052 1727204413.31910: Loaded config def from plugin (shell/powershell) 34052 1727204413.31913: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 34052 1727204413.31980: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 34052 1727204413.32189: Loaded config def from plugin (shell/sh) 34052 1727204413.32191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 34052 1727204413.32231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 34052 1727204413.32376: Loaded config def from plugin (become/runas) 34052 1727204413.32379: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 34052 1727204413.32602: Loaded config def from plugin (become/su) 34052 1727204413.32604: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 34052 1727204413.32793: Loaded config def from plugin (become/sudo) 34052 1727204413.32796: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 34052 1727204413.32842: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 34052 1727204413.33255: in VariableManager get_vars() 34052 1727204413.33286: done with get_vars() 34052 1727204413.33454: trying /usr/local/lib/python3.12/site-packages/ansible/modules 34052 1727204413.40450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 34052 1727204413.40788: in VariableManager get_vars() 34052 1727204413.40794: done with get_vars() 34052 1727204413.40797: variable 'playbook_dir' from source: magic vars 34052 1727204413.40798: variable 'ansible_playbook_python' from source: magic vars 34052 1727204413.40799: variable 'ansible_config_file' from source: 
magic vars 34052 1727204413.40800: variable 'groups' from source: magic vars 34052 1727204413.40801: variable 'omit' from source: magic vars 34052 1727204413.40801: variable 'ansible_version' from source: magic vars 34052 1727204413.40802: variable 'ansible_check_mode' from source: magic vars 34052 1727204413.40803: variable 'ansible_diff_mode' from source: magic vars 34052 1727204413.40803: variable 'ansible_forks' from source: magic vars 34052 1727204413.40804: variable 'ansible_inventory_sources' from source: magic vars 34052 1727204413.40805: variable 'ansible_skip_tags' from source: magic vars 34052 1727204413.40806: variable 'ansible_limit' from source: magic vars 34052 1727204413.40806: variable 'ansible_run_tags' from source: magic vars 34052 1727204413.40808: variable 'ansible_verbosity' from source: magic vars 34052 1727204413.40853: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml 34052 1727204413.42361: in VariableManager get_vars() 34052 1727204413.42386: done with get_vars() 34052 1727204413.42432: in VariableManager get_vars() 34052 1727204413.42450: done with get_vars() 34052 1727204413.43842: in VariableManager get_vars() 34052 1727204413.44123: done with get_vars() 34052 1727204413.44133: variable 'omit' from source: magic vars 34052 1727204413.44164: variable 'omit' from source: magic vars 34052 1727204413.44216: in VariableManager get_vars() 34052 1727204413.44235: done with get_vars() 34052 1727204413.44300: in VariableManager get_vars() 34052 1727204413.44317: done with get_vars() 34052 1727204413.44360: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34052 1727204413.45166: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34052 1727204413.45631: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34052 1727204413.46943: in VariableManager get_vars() 34052 1727204413.46974: done with get_vars() 34052 1727204413.47489: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 34052 1727204413.47674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34052 1727204413.50562: in VariableManager get_vars() 34052 1727204413.50589: done with get_vars() 34052 1727204413.50637: in VariableManager get_vars() 34052 1727204413.50676: done with get_vars() 34052 1727204413.52136: in VariableManager get_vars() 34052 1727204413.52277: done with get_vars() 34052 1727204413.52283: variable 'omit' from source: magic vars 34052 1727204413.52296: variable 'omit' from source: magic vars 34052 1727204413.52333: in VariableManager get_vars() 34052 1727204413.52349: done with get_vars() 34052 1727204413.52500: in VariableManager get_vars() 34052 1727204413.52521: done with get_vars() 34052 1727204413.52557: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34052 1727204413.52761: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34052 1727204413.53082: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34052 1727204413.57951: in VariableManager get_vars() 34052 1727204413.57980: done with get_vars() redirecting (type: 
action) ansible.builtin.yum to ansible.builtin.dnf 34052 1727204413.62281: in VariableManager get_vars() 34052 1727204413.62310: done with get_vars() 34052 1727204413.62585: in VariableManager get_vars() 34052 1727204413.62614: done with get_vars() 34052 1727204413.62683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 34052 1727204413.62698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 34052 1727204413.63215: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 34052 1727204413.63720: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 34052 1727204413.63725: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 34052 1727204413.63763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 34052 1727204413.63916: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 34052 1727204413.64490: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 34052 1727204413.64627: Loaded config def from plugin (callback/default) 34052 1727204413.64631: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 34052 1727204413.67813: Loaded config def from plugin (callback/junit) 34052 1727204413.67817: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 34052 1727204413.67933: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 34052 1727204413.68019: Loaded config def from plugin (callback/minimal) 34052 1727204413.68022: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 34052 1727204413.68078: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 34052 1727204413.68139: 
Loaded config def from plugin (callback/tree) 34052 1727204413.68141: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 34052 1727204413.68363: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 34052 1727204413.68369: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ipv6_nm.yml **************************************************** 2 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 34052 1727204413.68406: in VariableManager get_vars() 34052 1727204413.68423: done with get_vars() 34052 1727204413.68430: in VariableManager get_vars() 34052 1727204413.68439: done with get_vars() 34052 1727204413.68443: variable 'omit' from source: magic vars 34052 1727204413.68579: in VariableManager get_vars() 34052 1727204413.68603: done with get_vars() 34052 1727204413.68630: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] ************* 34052 1727204413.70454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 34052 1727204413.70668: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 34052 1727204413.70762: getting the remaining hosts for this loop 34052 1727204413.70765: done getting the remaining hosts for this loop 34052 1727204413.70770: getting the next task for host managed-node1 34052 1727204413.70775: done getting next task for host managed-node1 34052 1727204413.70778: ^ task is: TASK: Gathering Facts 34052 1727204413.70779: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204413.70782: getting variables 34052 1727204413.70783: in VariableManager get_vars() 34052 1727204413.70797: Calling all_inventory to load vars for managed-node1 34052 1727204413.70800: Calling groups_inventory to load vars for managed-node1 34052 1727204413.70803: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204413.70819: Calling all_plugins_play to load vars for managed-node1 34052 1727204413.70831: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204413.70835: Calling groups_plugins_play to load vars for managed-node1 34052 1727204413.71003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204413.71164: done with get_vars() 34052 1727204413.71311: done getting variables 34052 1727204413.71397: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6 Tuesday 24 September 2024 15:00:13 -0400 (0:00:00.032) 0:00:00.032 ***** 34052 1727204413.71496: entering _queue_task() for managed-node1/gather_facts 34052 1727204413.71497: Creating lock for gather_facts 34052 1727204413.72206: worker is 1 (out of 1 available) 34052 1727204413.72219: exiting _queue_task() for managed-node1/gather_facts 34052 1727204413.72236: done queuing things up, now waiting for results queue to drain 34052 1727204413.72238: waiting for pending results... 
34052 1727204413.72734: running TaskExecutor() for managed-node1/TASK: Gathering Facts 34052 1727204413.72907: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000b9 34052 1727204413.72913: variable 'ansible_search_path' from source: unknown 34052 1727204413.73088: calling self._execute() 34052 1727204413.73169: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204413.73180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204413.73184: variable 'omit' from source: magic vars 34052 1727204413.73558: variable 'omit' from source: magic vars 34052 1727204413.73562: variable 'omit' from source: magic vars 34052 1727204413.73588: variable 'omit' from source: magic vars 34052 1727204413.73643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204413.73791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204413.73815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204413.73840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204413.73884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204413.73889: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204413.73892: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204413.73894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204413.74337: Set connection var ansible_connection to ssh 34052 1727204413.74341: Set connection var ansible_timeout to 10 34052 1727204413.74344: Set connection var ansible_pipelining to False 34052 1727204413.74347: Set connection var ansible_shell_type to sh 34052 1727204413.74428: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204413.74432: Set connection var ansible_shell_executable to /bin/sh 34052 1727204413.74435: variable 'ansible_shell_executable' from source: unknown 34052 1727204413.74439: variable 'ansible_connection' from source: unknown 34052 1727204413.74441: variable 'ansible_module_compression' from source: unknown 34052 1727204413.74443: variable 'ansible_shell_type' from source: unknown 34052 1727204413.74445: variable 'ansible_shell_executable' from source: unknown 34052 1727204413.74448: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204413.74450: variable 'ansible_pipelining' from source: unknown 34052 1727204413.74452: variable 'ansible_timeout' from source: unknown 34052 1727204413.74454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204413.75101: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204413.75119: variable 'omit' from source: magic vars 34052 1727204413.75122: starting attempt loop 34052 1727204413.75125: running the handler 34052 1727204413.75172: variable 'ansible_facts' from source: unknown 34052 1727204413.75176: _low_level_execute_command(): starting 34052 1727204413.75178: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204413.76936: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204413.76941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204413.76945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204413.77035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204413.77040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204413.77236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204413.77251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204413.79046: stdout chunk (state=3): >>>/root <<< 34052 1727204413.79150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204413.79272: stderr chunk (state=3): >>><<< 34052 1727204413.79276: stdout chunk (state=3): >>><<< 34052 1727204413.79573: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204413.79577: _low_level_execute_command(): starting 34052 1727204413.79581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682 `" && echo ansible-tmp-1727204413.7932034-34160-60684122265682="` echo /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682 `" ) && sleep 0' 34052 
1727204413.80843: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204413.80893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204413.80901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204413.80917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204413.80931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204413.80934: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204413.81072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204413.81348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204413.81396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204413.83673: stdout chunk (state=3): >>>ansible-tmp-1727204413.7932034-34160-60684122265682=/root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682 <<< 34052 1727204413.83678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204413.83680: stderr chunk (state=3): >>><<< 34052 1727204413.83837: stdout chunk (state=3): >>><<< 34052 1727204413.83841: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204413.7932034-34160-60684122265682=/root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204413.83844: variable 'ansible_module_compression' from source: unknown 34052 1727204413.83884: ANSIBALLZ: Using generic lock for ansible.legacy.setup 34052 1727204413.83888: 
ANSIBALLZ: Acquiring lock 34052 1727204413.83891: ANSIBALLZ: Lock acquired: 140141530567488 34052 1727204413.83893: ANSIBALLZ: Creating module 34052 1727204414.29267: ANSIBALLZ: Writing module into payload 34052 1727204414.29572: ANSIBALLZ: Writing module 34052 1727204414.29577: ANSIBALLZ: Renaming module 34052 1727204414.29579: ANSIBALLZ: Done creating module 34052 1727204414.29581: variable 'ansible_facts' from source: unknown 34052 1727204414.29584: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204414.29586: _low_level_execute_command(): starting 34052 1727204414.29589: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 34052 1727204414.30268: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204414.30334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204414.30389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204414.30411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204414.30454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204414.30570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204414.32340: stdout chunk (state=3): >>>PLATFORM <<< 34052 1727204414.32514: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 <<< 34052 1727204414.32673: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 34052 1727204414.32677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204414.32683: stdout chunk (state=3): >>><<< 34052 1727204414.32690: stderr chunk (state=3): >>><<< 34052 1727204414.32890: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204414.32975 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 34052 1727204414.32979: _low_level_execute_command(): starting 34052 1727204414.32982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 34052 1727204414.33157: Sending initial data 34052 1727204414.33160: Sent initial data (1181 bytes) 34052 1727204414.34390: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204414.34599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204414.34689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204414.38452: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 34052 1727204414.38957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204414.38992: stderr chunk (state=3): >>><<< 34052 1727204414.39062: stdout chunk (state=3): >>><<< 34052 1727204414.39085: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 
(Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204414.39411: variable 'ansible_facts' from source: unknown 34052 1727204414.39414: variable 'ansible_facts' from source: unknown 34052 1727204414.39473: variable 'ansible_module_compression' from source: unknown 34052 1727204414.39480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34052 1727204414.39501: variable 'ansible_facts' from source: unknown 34052 1727204414.39832: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/AnsiballZ_setup.py 34052 1727204414.40382: Sending initial data 34052 1727204414.40391: Sent initial data (153 bytes) 34052 1727204414.41845: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204414.42074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204414.42079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204414.42098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 
10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204414.42154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204414.42179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204414.42343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204414.42541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204414.44203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204414.44446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204414.44533: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpvnb96l7x /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/AnsiballZ_setup.py <<< 34052 1727204414.44537: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/AnsiballZ_setup.py" <<< 34052 1727204414.44571: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpvnb96l7x" to remote "/root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/AnsiballZ_setup.py" <<< 34052 1727204414.49374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204414.49379: stdout chunk (state=3): >>><<< 34052 1727204414.49381: stderr chunk (state=3): >>><<< 34052 1727204414.49383: done transferring module to remote 34052 1727204414.49385: _low_level_execute_command(): starting 34052 1727204414.49387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/ /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/AnsiballZ_setup.py && sleep 0' 34052 1727204414.50847: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204414.50863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204414.51178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204414.51264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204414.53290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204414.53308: stderr chunk (state=3): >>><<< 34052 1727204414.53316: stdout chunk (state=3): >>><<< 34052 1727204414.53338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204414.53346: _low_level_execute_command(): starting 34052 1727204414.53355: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/AnsiballZ_setup.py && sleep 0' 34052 1727204414.54653: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204414.54672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204414.54703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204414.54884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204414.54923: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204414.55034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204414.55051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204414.55206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204414.57668: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34052 1727204414.57752: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 34052 1727204414.57781: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34052 1727204414.57818: stdout chunk (state=3): >>>import 'posix' # <<< 34052 1727204414.57854: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34052 1727204414.57891: stdout chunk (state=3): >>>import 'time' # <<< 34052 1727204414.57894: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 34052 1727204414.58018: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 34052 1727204414.58033: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34052 1727204414.58071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed381a4530> <<< 34052 1727204414.58180: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38173b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed381a6ab0> import '_signal' # <<< 34052 1727204414.58282: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 34052 1727204414.58380: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 34052 1727204414.58399: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 34052 1727204414.58421: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 34052 1727204414.58437: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34052 1727204414.58510: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34052 1727204414.58538: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f551c0> <<< 34052 1727204414.58613: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f560c0> <<< 34052 1727204414.58641: stdout chunk (state=3): >>>import 'site' # <<< 34052 1727204414.58728: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34052 1727204414.59096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34052 1727204414.59115: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34052 1727204414.59161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34052 1727204414.59196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34052 1727204414.59282: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f93ef0> <<< 34052 1727204414.59289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34052 1727204414.59323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34052 1727204414.59327: stdout chunk (state=3): >>>import '_operator' # <<< 34052 1727204414.59330: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f93fb0> <<< 34052 1727204414.59368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34052 1727204414.59671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fcb8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fcbf20> import '_collections' # <<< 34052 1727204414.59691: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fabbc0> import '_functools' # import 'functools' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fed37fa92e0> <<< 34052 1727204414.59754: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f910a0> <<< 34052 1727204414.59783: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34052 1727204414.59808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34052 1727204414.59935: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34052 1727204414.59939: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fef770> <<< 34052 1727204414.59950: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fee390> <<< 34052 1727204414.59983: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 34052 1727204414.59994: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37faa2d0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f92960> <<< 34052 1727204414.60172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34052 1727204414.60176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed380207a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f90320> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed38020c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38020b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.60188: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed38020ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f8ee40> <<< 34052 1727204414.60216: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204414.60241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34052 1727204414.60271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34052 1727204414.60294: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38021550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38021220> <<< 34052 1727204414.60396: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38022450> import 'importlib.util' # import 'runpy' # <<< 34052 1727204414.60415: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34052 1727204414.60445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34052 1727204414.60472: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 34052 1727204414.60485: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed3803c680> <<< 34052 1727204414.60513: stdout chunk (state=3): >>>import 'errno' # <<< 34052 1727204414.60591: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed3803ddc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34052 1727204414.60596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 34052 1727204414.60615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 34052 1727204414.60632: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed3803ec60> <<< 34052 1727204414.60732: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed3803f2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed3803e1b0> <<< 34052 1727204414.60771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' 
<<< 34052 1727204414.60775: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed3803fd40> <<< 34052 1727204414.60777: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed3803f470> <<< 34052 1727204414.60847: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed380224b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34052 1727204414.60880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34052 1727204414.60891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34052 1727204414.61086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d33c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34052 1727204414.61190: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d5c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d5c740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d5c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d31df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34052 1727204414.61228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34052 1727204414.61249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34052 1727204414.61267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34052 1727204414.61279: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5df70> <<< 34052 1727204414.61320: stdout chunk (state=3): >>>import 'weakref' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5cbf0> <<< 34052 1727204414.61331: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38022630> <<< 34052 1727204414.61379: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34052 1727204414.61405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204414.61423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34052 1727204414.61473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34052 1727204414.61501: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d8a300> <<< 34052 1727204414.61552: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34052 1727204414.61566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204414.61643: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34052 1727204414.61667: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37da2480> <<< 34052 1727204414.61752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34052 1727204414.61791: stdout chunk (state=3): >>>import 'ntpath' # <<< 34052 1727204414.61819: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ddf230> <<< 34052 1727204414.61973: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34052 1727204414.61977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34052 1727204414.62052: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37e019d0> <<< 34052 1727204414.62188: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ddf350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37da3110> <<< 34052 1727204414.62205: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fed37c24380> <<< 34052 1727204414.62231: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37da14c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5eea0> <<< 34052 1727204414.62407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34052 1727204414.62422: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fed37c24620> <<< 34052 1727204414.62613: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_sowwmm02/ansible_ansible.legacy.setup_payload.zip' <<< 34052 1727204414.62696: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.62761: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.62785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34052 1727204414.62836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34052 1727204414.62847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34052 1727204414.62931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34052 1727204414.63070: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c8a150> import '_typing' # <<< 34052 1727204414.63173: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c61040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c601a0> # zipimport: zlib available <<< 34052 1727204414.63212: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 34052 1727204414.63238: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.63254: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.63397: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 34052 1727204414.63400: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.64857: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.66201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c63fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 34052 1727204414.66224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34052 1727204414.66498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37cbdb50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbd8e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbd1f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34052 1727204414.66504: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbdc40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c8ab70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37cbe840> <<< 34052 1727204414.66520: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37cbea80> <<< 34052 1727204414.66533: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34052 1727204414.66586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34052 1727204414.66603: stdout chunk (state=3): >>>import '_locale' # <<< 34052 1727204414.66647: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbef60> <<< 34052 1727204414.66660: stdout chunk (state=3): >>>import 'pwd' # <<< 34052 1727204414.66723: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34052 1727204414.66755: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b20d10> <<< 34052 1727204414.66829: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.66832: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b22930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34052 1727204414.66863: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fed37b232f0> <<< 34052 1727204414.66880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34052 1727204414.66940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b241d0> <<< 34052 1727204414.66951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34052 1727204414.67174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b26f60> <<< 34052 1727204414.67181: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b27080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b25220> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34052 1727204414.67215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34052 1727204414.67218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 34052 1727204414.67220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34052 1727204414.67232: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34052 1727204414.67270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34052 1727204414.67495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b2aea0> import '_tokenize' # <<< 34052 1727204414.67499: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b29970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b296d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34052 1727204414.67519: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b2bfb0> <<< 34052 1727204414.67543: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b25730> <<< 34052 1727204414.67565: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.67579: 
stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b6f0e0> <<< 34052 1727204414.67610: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b6f260> <<< 34052 1727204414.67726: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b74e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b74bc0> <<< 34052 1727204414.67738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34052 1727204414.67878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34052 1727204414.67936: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.67949: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b77320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b754c0> <<< 34052 1727204414.67981: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34052 1727204414.68058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34052 1727204414.68102: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b7ea80> <<< 34052 1727204414.68241: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b77410> <<< 34052 1727204414.68322: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fed37b7fd40> <<< 34052 1727204414.68374: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b7fb60> <<< 34052 1727204414.68500: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b7fd70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b6f4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34052 1727204414.68520: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.68554: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b834a0> <<< 34052 1727204414.68725: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.68749: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b84920> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b81c10> <<< 34052 1727204414.68803: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.68975: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b82fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b81820> # zipimport: zlib available <<< 34052 1727204414.69021: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.69038: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.69061: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34052 1727204414.69093: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34052 1727204414.69107: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34052 1727204414.69237: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.69370: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.70014: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.70685: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 34052 1727204414.70708: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204414.70763: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a0c8f0> <<< 34052 1727204414.70884: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0d640> <<< 34052 1727204414.70887: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b82750> <<< 34052 1727204414.70960: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34052 1727204414.70982: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34052 1727204414.71137: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.71339: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34052 1727204414.71378: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0d310> # zipimport: zlib available <<< 34052 1727204414.71999: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.72374: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.72448: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.72525: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34052 1727204414.72649: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.72669: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 34052 1727204414.72871: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.72874: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34052 1727204414.72882: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.72884: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 34052 1727204414.72886: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.72888: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34052 1727204414.72986: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34052 1727204414.73194: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.73453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34052 1727204414.73524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34052 1727204414.73640: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0f980> # zipimport: zlib available <<< 34052 1727204414.73697: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.73787: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34052 1727204414.73798: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34052 1727204414.73825: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34052 1727204414.73980: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.74064: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a16030> <<< 34052 1727204414.74189: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a169c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0f200> # zipimport: zlib available <<< 34052 1727204414.74206: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74251: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 34052 1727204414.74299: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74346: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74409: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74488: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34052 1727204414.74536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204414.74637: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a15730> <<< 34052 1727204414.74681: 
stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a16b40> <<< 34052 1727204414.74729: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 34052 1727204414.74740: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74799: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74879: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74904: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.74961: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204414.74983: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34052 1727204414.75031: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34052 1727204414.75034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34052 1727204414.75103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34052 1727204414.75127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34052 1727204414.75141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34052 1727204414.75195: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37aaec90> <<< 34052 1727204414.75244: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a209e0> <<< 34052 1727204414.75340: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a1eab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a1e900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34052 1727204414.75364: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75401: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75412: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34052 1727204414.75490: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34052 1727204414.75517: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 34052 1727204414.75587: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75658: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75689: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75707: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75746: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75790: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75829: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34052 1727204414.75881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34052 1727204414.75884: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.75969: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.76045: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.76077: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.76121: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 34052 1727204414.76124: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.76573: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.76661: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 34052 1727204414.76683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 34052 1727204414.76723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34052 1727204414.76789: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab1a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34052 1727204414.76792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34052 1727204414.76804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34052 1727204414.76875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34052 1727204414.76894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f48140> <<< 34052 1727204414.76918: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.76955: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36f48440> <<< 34052 1727204414.77109: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a91190> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a92360> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab0170> import 'multiprocessing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab0b90> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34052 1727204414.77199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34052 1727204414.77203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34052 1727204414.77217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 34052 1727204414.77316: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36f4b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4acf0> <<< 34052 1727204414.77332: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36f4aed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4a150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34052 1727204414.77535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34052 1727204414.77547: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4b530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36fb6030> <<< 34052 1727204414.77574: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4bf80> <<< 34052 1727204414.77633: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab38c0> import 'ansible.module_utils.facts.timeout' # <<< 34052 1727204414.77752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 34052 1727204414.77771: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.77814: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.other.facter' # <<< 34052 1727204414.77818: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.77916: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.77968: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 34052 1727204414.78129: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 34052 1727204414.78133: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.78159: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 34052 1727204414.78202: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.78351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34052 1727204414.78372: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.78389: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.78456: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.78517: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 34052 1727204414.78531: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.79095: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.79621: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 34052 1727204414.79677: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.79734: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.79762: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.79810: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 34052 1727204414.79820: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.79863: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.79892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 34052 1727204414.79949: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80024: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34052 1727204414.80030: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80056: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 34052 1727204414.80116: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80152: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80171: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 34052 1727204414.80183: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80257: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from 
'/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34052 1727204414.80382: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fb7b60> <<< 34052 1727204414.80428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34052 1727204414.80441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34052 1727204414.80579: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fb6d20> <<< 34052 1727204414.80590: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 34052 1727204414.80653: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 34052 1727204414.80837: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.80947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 34052 1727204414.80950: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.81272: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.81277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34052 1727204414.81328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34052 1727204414.81340: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.81412: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36fe2330> <<< 34052 1727204414.81696: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fceb10> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.81762: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34052 1727204414.81776: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.82013: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.82071: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.82250: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 34052 1727204414.82317: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.82443: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 34052 1727204414.82486: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204414.82544: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36ffdaf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fcf410> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 34052 1727204414.82590: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 34052 1727204414.82604: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.82676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34052 1727204414.82873: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.83003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 34052 1727204414.83115: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.83231: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.83268: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.83316: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 34052 1727204414.83378: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.83551: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.83702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34052 1727204414.83706: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.83837: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.83978: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 34052 1727204414.84019: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.84054: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.84720: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.85386: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 34052 1727204414.85538: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.85542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34052 1727204414.85545: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.85696: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.85761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34052 1727204414.85779: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.85933: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.86110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34052 1727204414.86147: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34052 
1727204414.86368: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.86398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.86459: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.86688: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.86907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 34052 1727204414.86942: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 34052 1727204414.86966: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.87004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 34052 1727204414.87019: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.87172: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.87228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34052 1727204414.87300: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.87311: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 34052 1727204414.87393: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.87430: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 34052 1727204414.87440: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.87503: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.87631: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 34052 1727204414.87863: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88242: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 34052 1727204414.88275: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 34052 1727204414.88356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88386: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 34052 1727204414.88487: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 34052 1727204414.88497: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34052 1727204414.88546: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88626: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 34052 1727204414.88745: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 34052 1727204414.88797: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88842: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34052 1727204414.88918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 34052 1727204414.88921: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.88934: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34052 1727204414.89032: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89036: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89132: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 34052 1727204414.89263: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89269: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89347: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 34052 1727204414.89538: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89759: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34052 1727204414.89781: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89809: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 34052 1727204414.89878: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89917: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.89971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 34052 1727204414.89987: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.90064: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.90157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 34052 1727204414.90176: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.90269: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.90376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 34052 1727204414.90380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34052 1727204414.90456: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204414.91000: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34052 1727204414.91005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34052 1727204414.91062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34052 1727204414.91086: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 
'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36e27200> <<< 34052 1727204414.91089: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e251c0> <<< 34052 1727204414.91121: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e25e50> <<< 34052 1727204415.07942: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 34052 1727204415.07946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 34052 1727204415.07988: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e6d4c0> <<< 34052 1727204415.08005: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 34052 1727204415.08026: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e6e780> <<< 34052 1727204415.08079: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 34052 1727204415.08099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.08138: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 34052 1727204415.08156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36ec0a40> <<< 34052 1727204415.08178: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36ec0530> <<< 34052 1727204415.08506: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 34052 1727204415.28890: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", 
"ansible_machine_id": "ec26514e308c194cfcd8a9c892de18dd", "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDLk70qUDdIlMGefmY9CSzrAInUx7bdf89EgNGwy+627RdK/JZ6JwRBpph6RT/Xj1n4IlrnVUjUiUaorYlNqj24r7gfLUKrzB4vu8pIjwI6ge8+qjGGZDnQm+SKJK65ECm944hk7VFOi1xZWQJNYN9xVACr/ifxYeQOLNjmwajWGL4iKhiO4shsjmafF13uWUiv8C8TB9VoiAf+UJPc5DUojGJ0pjF2P/VkLEYMGRslXiQJ+GH1QxrlNZZrQY5v5Xfsd7i7l5F01JvvOvVJHkZOt/vBCvIhn7TxIdIa+95vg9XsSUTY9S0avSZv95Ua/hGHIxgLE5CNJIQUdwfJnNi0gPblQGjNj3TVx+VqgLzOjFTfD8EIkJFmC/DMhm0bCDgdclIMmmhdkJDQ6ApjJcbRElBMa+IwZZd+l+qfD/DWcsigb7wftf43WI+Y74+SRpYtLmq0h3XeubKMqvxdqOIm05stM4OxvJgopHVPTepTczripmjJ0lbfD8TkdY3NYw8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFS01F5Tk75zUFCO6hP1eZVzOfFBOUa1U6ePV4u7EOwcevlrKoP/8LVaMLToSYNDptDQpZQIlpx02mv3wOPx14c=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIEbqXmW7LS2UP4fmMBI/TP3Wh1Hqq5KAj8b9n0HP0o8r", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "14", "epoch": "1727204414", "epoch_int": "1727204414", "date": "2024-09-24", "time": "15:00:14", "iso8601_micro": "2024-09-24T19:00:14.913401Z", "iso8601": "2024-09-24T19:00:14Z", "iso8601_basic": "20240924T150014913401", "iso8601_basic_short": "20240924T150014", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.5361328125, "5m": 0.48388671875, "15m": 0.2998046875}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "L<<< 34052 1727204415.28904: stdout chunk (state=3): >>>ANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 37226 10.31.8.176 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 37226 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10d0:dfff:fe0f:c94d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.8.176"], "ansible_all_ipv6_addresses": ["fe80::10d0:dfff:fe0f:c94d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.8.176", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10d0:dfff:fe0f:c94d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, 
"ansible_memfree_mb": 3037, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 679, "free": 3037}, "nocache": {"free": 3476, "used": 240}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_uuid": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_version": "4.<<< 34052 1727204415.28940: stdout chunk (state=3): >>>11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 720, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251314958336, "block_size": 4096, "block_total": 64479564, "block_available": 61356191, "block_used": 3123373, "inode_total": 16384000, "inode_available": 16301496, "inode_used": 82504, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34052 1727204415.29830: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] 
removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator <<< 34052 1727204415.30220: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json 
# cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy 
ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 34052 1727204415.30475: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34052 1727204415.30496: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 34052 1727204415.30538: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34052 1727204415.30596: stdout chunk (state=3): >>># destroy ntpath <<< 34052 1727204415.30608: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 34052 1727204415.30644: stdout chunk (state=3): >>># 
destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 34052 1727204415.30765: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 34052 1727204415.30770: stdout chunk (state=3): >>># destroy distro <<< 34052 1727204415.30836: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy logging <<< 34052 1727204415.30858: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 34052 1727204415.30883: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 34052 1727204415.30908: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 34052 1727204415.30968: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 34052 1727204415.31089: stdout chunk (state=3): >>># destroy _ssl <<< 34052 1727204415.31107: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 34052 1727204415.31154: stdout chunk (state=3): >>># destroy json <<< 34052 1727204415.31176: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno <<< 34052 1727204415.31370: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context<<< 34052 1727204415.31386: stdout chunk (state=3): >>> # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 34052 1727204415.31442: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 34052 1727204415.31472: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 34052 1727204415.31495: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34052 1727204415.31730: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 34052 1727204415.31767: stdout chunk (state=3): >>># destroy _collections <<< 34052 1727204415.31818: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34052 1727204415.31844: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34052 1727204415.31950: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34052 1727204415.32083: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 34052 1727204415.32098: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 34052 1727204415.32142: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 34052 1727204415.32196: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 34052 1727204415.32215: stdout chunk (state=3): >>># destroy itertools <<< 34052 1727204415.32336: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34052 1727204415.33060: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204415.33071: stdout chunk (state=3): >>><<< 34052 1727204415.33074: stderr chunk (state=3): >>><<< 34052 1727204415.33389: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed381a4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38173b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed381a6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f551c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f560c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f93ef0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f93fb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fcb8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fcbf20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fabbc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fa92e0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f910a0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fef770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37fee390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37faa2d0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f92960> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed380207a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f90320> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed38020c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38020b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed38020ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37f8ee40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38021550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38021220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38022450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed3803c680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed3803ddc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fed3803ec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed3803f2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed3803e1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed3803fd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed3803f470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed380224b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d33c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d5c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d5c740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37d5c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d31df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5df70> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5cbf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed38022630> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d8a300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37da2480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ddf230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37e019d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ddf350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37da3110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c24380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37da14c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37d5eea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fed37c24620> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_sowwmm02/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c8a150> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c61040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c601a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c63fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37cbdb50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbd8e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbd1f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbdc40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37c8ab70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37cbe840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37cbea80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37cbef60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b20d10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b22930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b232f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b241d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b26f60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b27080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b25220> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b2aea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b29970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b296d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b2bfb0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b25730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b6f0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b6f260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b74e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b74bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b77320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b754c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b7ea80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b77410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b7fd40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b7fb60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b7fd70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b6f4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b834a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b84920> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b81c10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37b82fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b81820> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a0c8f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0d640> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37b82750> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0d310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0f980> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a16030> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a169c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a0f200> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed37a15730> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a16b40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37aaec90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a209e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a1eab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a1e900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab1a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fed36f48140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36f48440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a91190> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37a92360> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab0170> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab0b90> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36f4b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36f4aed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4a150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4b530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36fb6030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36f4bf80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed37ab38c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fb7b60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fb6d20> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36fe2330> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fceb10> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36ffdaf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36fcf410> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fed36e27200> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e251c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e25e50> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e6d4c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36e6e780> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36ec0a40> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fed36ec0530> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26514e308c194cfcd8a9c892de18dd", "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDLk70qUDdIlMGefmY9CSzrAInUx7bdf89EgNGwy+627RdK/JZ6JwRBpph6RT/Xj1n4IlrnVUjUiUaorYlNqj24r7gfLUKrzB4vu8pIjwI6ge8+qjGGZDnQm+SKJK65ECm944hk7VFOi1xZWQJNYN9xVACr/ifxYeQOLNjmwajWGL4iKhiO4shsjmafF13uWUiv8C8TB9VoiAf+UJPc5DUojGJ0pjF2P/VkLEYMGRslXiQJ+GH1QxrlNZZrQY5v5Xfsd7i7l5F01JvvOvVJHkZOt/vBCvIhn7TxIdIa+95vg9XsSUTY9S0avSZv95Ua/hGHIxgLE5CNJIQUdwfJnNi0gPblQGjNj3TVx+VqgLzOjFTfD8EIkJFmC/DMhm0bCDgdclIMmmhdkJDQ6ApjJcbRElBMa+IwZZd+l+qfD/DWcsigb7wftf43WI+Y74+SRpYtLmq0h3XeubKMqvxdqOIm05stM4OxvJgopHVPTepTczripmjJ0lbfD8TkdY3NYw8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFS01F5Tk75zUFCO6hP1eZVzOfFBOUa1U6ePV4u7EOwcevlrKoP/8LVaMLToSYNDptDQpZQIlpx02mv3wOPx14c=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIEbqXmW7LS2UP4fmMBI/TP3Wh1Hqq5KAj8b9n0HP0o8r", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "14", "epoch": "1727204414", "epoch_int": "1727204414", "date": "2024-09-24", "time": "15:00:14", "iso8601_micro": "2024-09-24T19:00:14.913401Z", "iso8601": "2024-09-24T19:00:14Z", "iso8601_basic": "20240924T150014913401", 
"iso8601_basic_short": "20240924T150014", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.5361328125, "5m": 0.48388671875, "15m": 0.2998046875}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 37226 10.31.8.176 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 37226 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10d0:dfff:fe0f:c94d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on 
[fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.8.176"], "ansible_all_ipv6_addresses": ["fe80::10d0:dfff:fe0f:c94d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.8.176", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10d0:dfff:fe0f:c94d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3037, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 679, "free": 3037}, "nocache": {"free": 3476, "used": 240}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_uuid": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 720, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251314958336, "block_size": 4096, "block_total": 64479564, "block_available": 61356191, "block_used": 3123373, "inode_total": 16384000, "inode_available": 16301496, "inode_used": 82504, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # 
cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # 
destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # 
cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc 
# cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed.
[WARNING]: Module invocation had junk after the JSON data: [Python interpreter shutdown/cleanup output, identical to the block printed above]
[WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
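The two warnings above are routine in a verbose run like this one: the first flags the Python interpreter shutdown/cleanup output that ansible.legacy.setup printed after its JSON result, and the second is ansible-core's interpreter-discovery notice for managed-node1. If the discovery warning is unwanted, the usual remedy is to pin ansible_python_interpreter per host. The snippet below is only an illustrative sketch: the host name and address come from this run, but the actual inventory file (/tmp/network-jrl/inventory-0Xx.yml) is not reproduced in the log, so the layout and the pinned interpreter path are assumptions.

all:
  hosts:
    managed-node1:
      ansible_host: 10.31.8.176
      # Pinning the interpreter skips discovery and silences the
      # "discovered Python interpreter" warning (assumed value, matching
      # the interpreter this run discovered on its own).
      ansible_python_interpreter: /usr/bin/python3.12

Any fact gathered in the setup result above can then be referenced from a task; for example (illustrative only), the primary IPv4 address reported in ansible_default_ipv4:

- name: Show the address fact gathering reported for this host
  ansible.builtin.debug:
    msg: "{{ ansible_default_ipv4.address }}"

For debugging the "junk after the JSON data" warning itself, running the play with ANSIBLE_KEEP_REMOTE_FILES=1 on the controller keeps the module payload under ~/.ansible/tmp on the target instead of letting the cleanup command in the next record remove it.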
34052 1727204415.37834: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204415.37936: _low_level_execute_command(): starting 34052 1727204415.37940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204413.7932034-34160-60684122265682/ > /dev/null 2>&1 && sleep 0' 34052 1727204415.38756: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204415.38780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204415.38912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204415.38938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204415.38955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204415.39057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204415.41343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204415.41558: stderr chunk (state=3): >>><<< 34052 1727204415.41562: stdout chunk (state=3): >>><<< 34052 1727204415.41567: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 
10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204415.41570: handler run complete 34052 1727204415.41833: variable 'ansible_facts' from source: unknown 34052 1727204415.42173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204415.42424: variable 'ansible_facts' from source: unknown 34052 1727204415.42530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204415.42703: attempt loop complete, returning result 34052 1727204415.42707: _execute() done 34052 1727204415.42710: dumping result to json 34052 1727204415.42742: done dumping result, returning 34052 1727204415.42752: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [127b8e07-fff9-66a4-e2a3-0000000000b9] 34052 1727204415.42758: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b9 ok: [managed-node1] 34052 1727204415.43979: no more pending results, returning what we have 34052 1727204415.43983: results queue empty 34052 1727204415.43984: checking for any_errors_fatal 34052 1727204415.43986: done checking for any_errors_fatal 34052 1727204415.43986: checking for max_fail_percentage 34052 1727204415.43988: done checking for max_fail_percentage 34052 1727204415.43989: checking to see if all hosts have failed and the running result is not ok 34052 1727204415.43990: done checking to see if all hosts have failed 34052 1727204415.43991: getting the remaining hosts for this loop 34052 1727204415.43992: done getting the remaining hosts for this loop 34052 1727204415.43996: getting the next task for host managed-node1 34052 1727204415.44002: done getting next task for host managed-node1 34052 1727204415.44004: ^ task is: TASK: meta (flush_handlers) 34052 1727204415.44006: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204415.44011: getting variables 34052 1727204415.44012: in VariableManager get_vars() 34052 1727204415.44036: Calling all_inventory to load vars for managed-node1 34052 1727204415.44039: Calling groups_inventory to load vars for managed-node1 34052 1727204415.44042: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204415.44051: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b9 34052 1727204415.44054: WORKER PROCESS EXITING 34052 1727204415.44064: Calling all_plugins_play to load vars for managed-node1 34052 1727204415.44070: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204415.44074: Calling groups_plugins_play to load vars for managed-node1 34052 1727204415.44289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204415.44543: done with get_vars() 34052 1727204415.44556: done getting variables 34052 1727204415.44632: in VariableManager get_vars() 34052 1727204415.44644: Calling all_inventory to load vars for managed-node1 34052 1727204415.44646: Calling groups_inventory to load vars for managed-node1 34052 1727204415.44649: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204415.44654: Calling all_plugins_play to load vars for managed-node1 34052 1727204415.44657: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204415.44660: Calling groups_plugins_play to load vars for managed-node1 34052 1727204415.44832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204415.45059: done with get_vars() 34052 1727204415.45078: done queuing things up, now waiting for results queue to drain 34052 1727204415.45080: results queue empty 34052 1727204415.45081: checking for any_errors_fatal 34052 1727204415.45084: done checking for any_errors_fatal 34052 1727204415.45085: checking for max_fail_percentage 34052 1727204415.45086: done checking for max_fail_percentage 34052 1727204415.45087: checking to see if all hosts have failed and the running result is not ok 34052 1727204415.45093: done checking to see if all hosts have failed 34052 1727204415.45094: getting the remaining hosts for this loop 34052 1727204415.45095: done getting the remaining hosts for this loop 34052 1727204415.45098: getting the next task for host managed-node1 34052 1727204415.45103: done getting next task for host managed-node1 34052 1727204415.45106: ^ task is: TASK: Include the task 'el_repo_setup.yml' 34052 1727204415.45108: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204415.45110: getting variables 34052 1727204415.45111: in VariableManager get_vars() 34052 1727204415.45120: Calling all_inventory to load vars for managed-node1 34052 1727204415.45122: Calling groups_inventory to load vars for managed-node1 34052 1727204415.45127: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204415.45133: Calling all_plugins_play to load vars for managed-node1 34052 1727204415.45136: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204415.45139: Calling groups_plugins_play to load vars for managed-node1 34052 1727204415.45307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204415.45571: done with get_vars() 34052 1727204415.45581: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11 Tuesday 24 September 2024 15:00:15 -0400 (0:00:01.741) 0:00:01.773 ***** 34052 1727204415.45674: entering _queue_task() for managed-node1/include_tasks 34052 1727204415.45677: Creating lock for include_tasks 34052 1727204415.46050: worker is 1 (out of 1 available) 34052 1727204415.46064: exiting _queue_task() for managed-node1/include_tasks 34052 1727204415.46487: done queuing things up, now waiting for results queue to drain 34052 1727204415.46490: waiting for pending results... 34052 1727204415.46671: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 34052 1727204415.46789: in run() - task 127b8e07-fff9-66a4-e2a3-000000000006 34052 1727204415.46894: variable 'ansible_search_path' from source: unknown 34052 1727204415.47062: calling self._execute() 34052 1727204415.47274: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204415.47290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204415.47407: variable 'omit' from source: magic vars 34052 1727204415.47553: _execute() done 34052 1727204415.47562: dumping result to json 34052 1727204415.47573: done dumping result, returning 34052 1727204415.47587: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-66a4-e2a3-000000000006] 34052 1727204415.47615: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000006 34052 1727204415.47813: no more pending results, returning what we have 34052 1727204415.47819: in VariableManager get_vars() 34052 1727204415.47864: Calling all_inventory to load vars for managed-node1 34052 1727204415.47870: Calling groups_inventory to load vars for managed-node1 34052 1727204415.47874: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204415.47893: Calling all_plugins_play to load vars for managed-node1 34052 1727204415.47896: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204415.47901: Calling groups_plugins_play to load vars for managed-node1 34052 1727204415.48445: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000006 34052 1727204415.48449: WORKER PROCESS EXITING 34052 1727204415.48489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204415.48717: done with get_vars() 34052 1727204415.48730: variable 'ansible_search_path' from source: unknown 34052 1727204415.48748: we have included files to process 34052 1727204415.48749: generating 
all_blocks data 34052 1727204415.48750: done generating all_blocks data 34052 1727204415.48751: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34052 1727204415.48752: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34052 1727204415.48755: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34052 1727204415.49513: in VariableManager get_vars() 34052 1727204415.49537: done with get_vars() 34052 1727204415.49553: done processing included file 34052 1727204415.49556: iterating over new_blocks loaded from include file 34052 1727204415.49558: in VariableManager get_vars() 34052 1727204415.49572: done with get_vars() 34052 1727204415.49574: filtering new block on tags 34052 1727204415.49592: done filtering new block on tags 34052 1727204415.49595: in VariableManager get_vars() 34052 1727204415.49608: done with get_vars() 34052 1727204415.49610: filtering new block on tags 34052 1727204415.49631: done filtering new block on tags 34052 1727204415.49635: in VariableManager get_vars() 34052 1727204415.49645: done with get_vars() 34052 1727204415.49647: filtering new block on tags 34052 1727204415.49660: done filtering new block on tags 34052 1727204415.49662: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 34052 1727204415.49671: extending task lists for all hosts with included blocks 34052 1727204415.49728: done extending task lists 34052 1727204415.49729: done processing included files 34052 1727204415.49730: results queue empty 34052 1727204415.49731: checking for any_errors_fatal 34052 1727204415.49732: done checking for any_errors_fatal 34052 1727204415.49733: checking for max_fail_percentage 34052 1727204415.49734: done checking for max_fail_percentage 34052 1727204415.49735: checking to see if all hosts have failed and the running result is not ok 34052 1727204415.49736: done checking to see if all hosts have failed 34052 1727204415.49737: getting the remaining hosts for this loop 34052 1727204415.49738: done getting the remaining hosts for this loop 34052 1727204415.49740: getting the next task for host managed-node1 34052 1727204415.49744: done getting next task for host managed-node1 34052 1727204415.49747: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 34052 1727204415.49749: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204415.49751: getting variables 34052 1727204415.49752: in VariableManager get_vars() 34052 1727204415.49761: Calling all_inventory to load vars for managed-node1 34052 1727204415.49764: Calling groups_inventory to load vars for managed-node1 34052 1727204415.49768: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204415.49775: Calling all_plugins_play to load vars for managed-node1 34052 1727204415.49777: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204415.49780: Calling groups_plugins_play to load vars for managed-node1 34052 1727204415.49977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204415.50211: done with get_vars() 34052 1727204415.50222: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:00:15 -0400 (0:00:00.046) 0:00:01.820 ***** 34052 1727204415.50307: entering _queue_task() for managed-node1/setup 34052 1727204415.50662: worker is 1 (out of 1 available) 34052 1727204415.50874: exiting _queue_task() for managed-node1/setup 34052 1727204415.50886: done queuing things up, now waiting for results queue to drain 34052 1727204415.50888: waiting for pending results... 34052 1727204415.50963: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 34052 1727204415.51090: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000ca 34052 1727204415.51115: variable 'ansible_search_path' from source: unknown 34052 1727204415.51122: variable 'ansible_search_path' from source: unknown 34052 1727204415.51168: calling self._execute() 34052 1727204415.51257: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204415.51273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204415.51287: variable 'omit' from source: magic vars 34052 1727204415.51902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204415.55657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204415.55915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204415.56075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204415.56108: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204415.56145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204415.56240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204415.56275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204415.56314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34052 1727204415.56372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204415.56397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204415.56880: variable 'ansible_facts' from source: unknown 34052 1727204415.57104: variable 'network_test_required_facts' from source: task vars 34052 1727204415.57160: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 34052 1727204415.57284: variable 'omit' from source: magic vars 34052 1727204415.57334: variable 'omit' from source: magic vars 34052 1727204415.57516: variable 'omit' from source: magic vars 34052 1727204415.57672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204415.57676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204415.57679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204415.57681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204415.57684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204415.57775: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204415.57786: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204415.57977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204415.58272: Set connection var ansible_connection to ssh 34052 1727204415.58276: Set connection var ansible_timeout to 10 34052 1727204415.58278: Set connection var ansible_pipelining to False 34052 1727204415.58281: Set connection var ansible_shell_type to sh 34052 1727204415.58284: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204415.58286: Set connection var ansible_shell_executable to /bin/sh 34052 1727204415.58289: variable 'ansible_shell_executable' from source: unknown 34052 1727204415.58292: variable 'ansible_connection' from source: unknown 34052 1727204415.58295: variable 'ansible_module_compression' from source: unknown 34052 1727204415.58298: variable 'ansible_shell_type' from source: unknown 34052 1727204415.58300: variable 'ansible_shell_executable' from source: unknown 34052 1727204415.58308: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204415.58318: variable 'ansible_pipelining' from source: unknown 34052 1727204415.58324: variable 'ansible_timeout' from source: unknown 34052 1727204415.58335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204415.58714: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204415.58737: variable 'omit' from source: magic vars 34052 1727204415.58749: starting attempt loop 34052 
1727204415.58756: running the handler 34052 1727204415.58777: _low_level_execute_command(): starting 34052 1727204415.58790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204415.59995: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204415.60299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204415.60319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204415.60345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204415.60453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204415.62984: stdout chunk (state=3): >>>/root <<< 34052 1727204415.63231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204415.63255: stdout chunk (state=3): >>><<< 34052 1727204415.63277: stderr chunk (state=3): >>><<< 34052 1727204415.63311: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204415.63345: _low_level_execute_command(): starting 34052 1727204415.63368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752 `" && echo ansible-tmp-1727204415.6332936-34205-119629259460752="` echo /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752 `" ) && 
sleep 0' 34052 1727204415.64168: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204415.64190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204415.64205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204415.64254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204415.64274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204415.64364: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204415.64394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204415.64422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204415.64439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204415.64561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204415.67447: stdout chunk (state=3): >>>ansible-tmp-1727204415.6332936-34205-119629259460752=/root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752 <<< 34052 1727204415.67700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204415.67741: stderr chunk (state=3): >>><<< 34052 1727204415.67757: stdout chunk (state=3): >>><<< 34052 1727204415.67786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204415.6332936-34205-119629259460752=/root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204415.67861: variable 'ansible_module_compression' from source: unknown 34052 1727204415.67938: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34052 1727204415.68021: variable 'ansible_facts' from source: unknown 34052 1727204415.68327: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/AnsiballZ_setup.py 34052 1727204415.68474: Sending initial data 34052 1727204415.68605: Sent initial data (154 bytes) 34052 1727204415.69312: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204415.69408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204415.69469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204415.69491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204415.69546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204415.69615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204415.71959: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204415.72013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204415.72017: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/AnsiballZ_setup.py" <<< 34052 1727204415.72020: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpq0ohr98y /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/AnsiballZ_setup.py <<< 34052 1727204415.72264: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpq0ohr98y" to remote "/root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/AnsiballZ_setup.py" <<< 34052 1727204415.74519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204415.74690: stderr chunk (state=3): >>><<< 34052 1727204415.74701: stdout chunk (state=3): >>><<< 34052 1727204415.74763: done transferring module to remote 34052 1727204415.74768: _low_level_execute_command(): starting 34052 1727204415.74771: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/ /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/AnsiballZ_setup.py && sleep 0' 34052 1727204415.75285: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204415.75289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204415.75292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204415.75294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204415.75360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204415.75363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204415.75364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204415.75420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204415.77743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204415.77827: stderr chunk (state=3): >>><<< 34052 1727204415.77830: stdout chunk (state=3): >>><<< 34052 1727204415.77869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 
10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204415.77873: _low_level_execute_command(): starting 34052 1727204415.77875: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/AnsiballZ_setup.py && sleep 0' 34052 1727204415.78358: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204415.78363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204415.78367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204415.78435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204415.78439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204415.78506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204415.81335: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34052 1727204415.81351: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 34052 1727204415.81416: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34052 1727204415.81488: stdout chunk (state=3): >>>import 'posix' # <<< 34052 1727204415.81518: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34052 1727204415.81539: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 34052 1727204415.81559: stdout chunk (state=3): >>> # installed zipimport hook <<< 34052 1727204415.81595: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 34052 1727204415.81610: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.81632: stdout chunk (state=3): >>>import '_codecs' # <<< 34052 1727204415.81653: stdout chunk (state=3): >>>import 'codecs' # <<< 34052 1727204415.81685: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34052 1727204415.81699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34052 1727204415.81731: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195da4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195d73b30> <<< 34052 1727204415.81764: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 34052 1727204415.81787: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195da6ab0> <<< 34052 1727204415.81816: stdout chunk (state=3): >>>import '_signal' # <<< 34052 1727204415.81837: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 34052 1727204415.81884: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 34052 1727204415.81986: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34052 1727204415.82010: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34052 1727204415.82064: stdout chunk (state=3): >>>import 'os' # <<< 34052 1727204415.82090: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 34052 1727204415.82122: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 34052 1727204415.82127: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34052 1727204415.82144: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 34052 1727204415.82186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34052 1727204415.82192: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195b991c0> <<< 34052 1727204415.82255: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.82274: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195b9a0c0> <<< 34052 1727204415.82295: stdout chunk (state=3): >>>import 'site' # <<< 34052 1727204415.82329: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34052 1727204415.82755: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34052 1727204415.82784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.82800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34052 1727204415.82852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34052 1727204415.82909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34052 1727204415.82913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34052 1727204415.83004: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd7fb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bec140> <<< 34052 1727204415.83020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34052 1727204415.83078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34052 1727204415.83123: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34052 1727204415.83180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 34052 1727204415.83242: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c0f950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c0ffe0> <<< 34052 1727204415.83311: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195befc20> <<< 34052 1727204415.83344: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bed3a0> <<< 34052 1727204415.83662: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd5160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c338f0> <<< 34052 1727204415.83668: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c32510> <<< 34052 1727204415.83671: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bee240> <<< 34052 1727204415.83673: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c30d70> <<< 34052 1727204415.83721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c60980> <<< 34052 1727204415.83747: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd43e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 34052 1727204415.83766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34052 1727204415.83785: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.83812: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c60e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c60ce0> <<< 34052 1727204415.83851: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c610d0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd2f00> <<< 34052 1727204415.83892: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.84017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34052 1727204415.84023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c617c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c61490> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/_abc.py <<< 34052 1727204415.84049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c626c0> <<< 34052 1727204415.84267: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 34052 1727204415.84297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c7c8c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c7dfd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c7ee10> <<< 34052 1727204415.84331: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.84355: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c7f440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c7e360> <<< 34052 1727204415.84376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34052 1727204415.84426: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.84439: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c7fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c7f530> <<< 34052 1727204415.84489: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c626f0> <<< 34052 1727204415.84505: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34052 1727204415.84538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34052 1727204415.84551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34052 1727204415.84576: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34052 1727204415.84623: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959bfcb0> <<< 34052 1727204415.84653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34052 1727204415.84697: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959e87d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959e8530> <<< 34052 1727204415.84710: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959e8800> <<< 34052 1727204415.84843: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959e89e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959bde50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34052 1727204415.84916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34052 1727204415.84949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959ea060> <<< 34052 1727204415.84982: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959e8d10> <<< 34052 1727204415.84998: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c62de0> <<< 34052 1727204415.85019: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34052 1727204415.85082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.85116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34052 1727204415.85144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34052 1727204415.85182: stdout chunk (state=3): >>>import 
'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a16390> <<< 34052 1727204415.85232: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34052 1727204415.85255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.85291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34052 1727204415.85301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34052 1727204415.85474: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a2e540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34052 1727204415.85478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34052 1727204415.85528: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a672f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34052 1727204415.85559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34052 1727204415.85586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34052 1727204415.85633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34052 1727204415.85770: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a8da90> <<< 34052 1727204415.85803: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a67410> <<< 34052 1727204415.85864: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a2f1d0> <<< 34052 1727204415.85896: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958644a0> <<< 34052 1727204415.85911: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a2d580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959eaf30> <<< 34052 1727204415.86185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0195864740> <<< 34052 1727204415.86303: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_t0yldh0p/ansible_setup_payload.zip' # zipimport: zlib available <<< 34052 1727204415.86447: stdout chunk (state=3): >>># zipimport: zlib available <<< 
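Note on the '# zipimport: found 103 names in .../ansible_setup_payload.zip' record above: because the module was launched with PYTHONVERBOSE=1 (see the _low_level_execute_command line earlier), the remote interpreter is tracing how the AnsiballZ payload is imported — the setup module and its ansible.module_utils are bundled in a zip and served through the standard zipimport hook installed at interpreter startup. A minimal, self-contained sketch of that import mechanism, using a hypothetical demo_pkg payload rather than the real one:

    import importlib, os, sys, tempfile, zipfile

    # Build a tiny stand-in payload zip (hypothetical; the real payload bundles
    # the module plus ansible.module_utils, 103 names in the trace above).
    payload = os.path.join(tempfile.mkdtemp(), "demo_payload.zip")
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("demo_pkg/__init__.py", "VALUE = 42\n")

    # Putting the zip on sys.path lets the built-in zipimport hook resolve
    # imports from inside it, which is what produces the '# zipimport: ...'
    # lines in the verbose trace.
    sys.path.insert(0, payload)
    mod = importlib.import_module("demo_pkg")
    print(mod.VALUE)  # -> 42

Run under 'python3 -v' this prints comparable '# zipimport:' records for demo_payload.zip, mirroring what the trace shows for ansible_setup_payload.zip.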
34052 1727204415.86472: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34052 1727204415.86541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34052 1727204415.86670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958d2270> import '_typing' # <<< 34052 1727204415.86877: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958a9160> <<< 34052 1727204415.86937: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958a8320> # zipimport: zlib available <<< 34052 1727204415.86970: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204415.86990: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 34052 1727204415.88622: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.90032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958ab680> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34052 1727204415.90100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195901ca0> <<< 34052 1727204415.90117: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195901a30> <<< 34052 1727204415.90157: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195901340> <<< 34052 1727204415.90177: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34052 1727204415.90206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34052 1727204415.90241: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f0195901d90> <<< 34052 1727204415.90244: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958d2f00> import 'atexit' # <<< 34052 1727204415.90276: stdout chunk (state=3): >>> <<< 34052 1727204415.90286: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.90310: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959029c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195902c00> <<< 34052 1727204415.90336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34052 1727204415.90395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34052 1727204415.90399: stdout chunk (state=3): >>>import '_locale' # <<< 34052 1727204415.90448: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195903050> <<< 34052 1727204415.90456: stdout chunk (state=3): >>>import 'pwd' # <<< 34052 1727204415.90476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34052 1727204415.90508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34052 1727204415.90546: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195764e60> <<< 34052 1727204415.90573: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.90598: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195766a80> <<< 34052 1727204415.90602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34052 1727204415.90631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34052 1727204415.90656: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195767440> <<< 34052 1727204415.90677: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34052 1727204415.90707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34052 1727204415.90749: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957685f0> <<< 34052 1727204415.90752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34052 1727204415.90790: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34052 1727204415.90812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 34052 1727204415.90822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34052 1727204415.90873: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576b0e0> <<< 34052 1727204415.90916: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019576b200> <<< 34052 1727204415.90953: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957693a0> <<< 34052 1727204415.90971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34052 1727204415.91008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34052 1727204415.91013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 34052 1727204415.91028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34052 1727204415.91040: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34052 1727204415.91083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34052 1727204415.91103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576f050> <<< 34052 1727204415.91125: stdout chunk (state=3): >>>import '_tokenize' # <<< 34052 1727204415.91209: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576db20> <<< 34052 1727204415.91229: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576d880> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34052 1727204415.91238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34052 1727204415.91314: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576ddf0> <<< 34052 1727204415.91345: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195769820> <<< 34052 1727204415.91379: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.91384: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957b3200> <<< 34052 1727204415.91417: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b3380> <<< 34052 1727204415.91433: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34052 1727204415.91454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34052 1727204415.91473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34052 1727204415.91533: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957b8f50> <<< 34052 1727204415.91545: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b8d10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34052 1727204415.91675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34052 1727204415.91743: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957bb440> <<< 34052 1727204415.91769: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b9640> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34052 1727204415.91810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.91847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34052 1727204415.91850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 34052 1727204415.91866: stdout chunk (state=3): >>>import '_string' # <<< 34052 1727204415.91901: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c2c60> <<< 34052 1727204415.92042: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957bb5f0> <<< 34052 1727204415.92125: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c3ec0> <<< 34052 1727204415.92160: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c3c80> <<< 34052 1727204415.92214: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c3fe0> <<< 34052 1727204415.92248: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b3680> <<< 34052 1727204415.92252: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34052 1727204415.92296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34052 1727204415.92328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34052 1727204415.92348: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.92374: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c76e0> <<< 34052 1727204415.92553: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.92556: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c88f0> <<< 34052 1727204415.92593: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c5e50> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.92599: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c71d0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c5a90> <<< 34052 1727204415.92619: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.92637: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34052 1727204415.92653: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.92753: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.92868: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.92902: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 34052 1727204415.92913: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34052 1727204415.93060: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.93184: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.93843: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.94471: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34052 1727204415.94515: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204415.94555: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.94582: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019564c950> <<< 34052 1727204415.94675: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34052 1727204415.94678: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564d730> <<< 34052 1727204415.94700: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c41a0> <<< 34052 1727204415.94752: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 34052 1727204415.94799: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.94815: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34052 1727204415.94975: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.95172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564c4d0> <<< 34052 1727204415.95176: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.95722: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96229: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96307: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96389: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34052 1727204415.96392: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96432: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96462: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 34052 1727204415.96551: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96659: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34052 1727204415.96691: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 34052 1727204415.96711: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96732: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.96782: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34052 1727204415.96792: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.97049: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.97320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34052 1727204415.97400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34052 1727204415.97403: stdout chunk (state=3): >>>import '_ast' # <<< 34052 1727204415.97488: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564e4b0> # zipimport: zlib available <<< 34052 1727204415.97577: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204415.97682: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34052 1727204415.97711: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34052 1727204415.97820: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.98147: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195656060> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01956569f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564f350> # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204415.98436: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204415.98548: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34052 1727204415.98643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 34052 
1727204415.98646: stdout chunk (state=3): >>> <<< 34052 1727204415.98812: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204415.98830: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01956557c0><<< 34052 1727204415.98847: stdout chunk (state=3): >>> <<< 34052 1727204415.98924: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195656ae0><<< 34052 1727204415.98933: stdout chunk (state=3): >>> <<< 34052 1727204415.98983: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 34052 1727204415.98994: stdout chunk (state=3): >>> <<< 34052 1727204415.99004: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 34052 1727204415.99030: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204415.99043: stdout chunk (state=3): >>> <<< 34052 1727204415.99162: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204415.99169: stdout chunk (state=3): >>> <<< 34052 1727204415.99284: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204415.99289: stdout chunk (state=3): >>> <<< 34052 1727204415.99345: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204415.99349: stdout chunk (state=3): >>> <<< 34052 1727204415.99423: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 34052 1727204415.99438: stdout chunk (state=3): >>> <<< 34052 1727204415.99445: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 34052 1727204415.99462: stdout chunk (state=3): >>> <<< 34052 1727204415.99490: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 34052 1727204415.99493: stdout chunk (state=3): >>> <<< 34052 1727204415.99573: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 34052 1727204415.99579: stdout chunk (state=3): >>> <<< 34052 1727204415.99678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 34052 1727204415.99711: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34052 1727204415.99751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 34052 1727204415.99867: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956eacc0><<< 34052 1727204415.99873: stdout chunk (state=3): >>> <<< 34052 1727204415.99956: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195660980><<< 34052 1727204415.99962: stdout chunk (state=3): >>> <<< 34052 1727204416.00094: stdout chunk (state=3): >>>import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f019565eb10><<< 34052 1727204416.00110: stdout chunk (state=3): >>> <<< 34052 1727204416.00122: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019565e960> <<< 34052 1727204416.00137: stdout chunk (state=3): >>># destroy ansible.module_utils.distro<<< 34052 1727204416.00152: stdout chunk (state=3): >>> <<< 34052 1727204416.00159: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 34052 1727204416.00233: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 34052 1727204416.00236: stdout chunk (state=3): >>> <<< 34052 1727204416.00281: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34052 1727204416.00289: stdout chunk (state=3): >>> <<< 34052 1727204416.00299: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 34052 1727204416.00400: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # <<< 34052 1727204416.00459: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 34052 1727204416.00482: stdout chunk (state=3): >>> import 'ansible.modules' # <<< 34052 1727204416.00488: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.00559: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.00647: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.00669: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.00691: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.00937: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204416.01045: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 34052 1727204416.01118: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.01124: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34052 1727204416.01182: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 34052 1727204416.01452: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.01673: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.01689: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.01759: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204416.01813: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34052 1727204416.01817: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 34052 1727204416.01856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34052 1727204416.01890: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956edbb0> <<< 
34052 1727204416.01933: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34052 1727204416.01936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34052 1727204416.02013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34052 1727204416.02022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34052 1727204416.02101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195128350> <<< 34052 1727204416.02105: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195128920> <<< 34052 1727204416.02154: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956cd3d0> <<< 34052 1727204416.02189: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956cc350> <<< 34052 1727204416.02210: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956ec290> <<< 34052 1727204416.02212: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956efe00> <<< 34052 1727204416.02251: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34052 1727204416.02550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019512b650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019512af00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019512b0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019512a360> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34052 1727204416.02734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019512b740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34052 1727204416.02755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34052 1727204416.02794: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204416.02800: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195192240> <<< 34052 1727204416.02847: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195190260><<< 34052 1727204416.02896: stdout chunk (state=3): >>> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956effb0><<< 34052 1727204416.02905: stdout chunk (state=3): >>> <<< 34052 1727204416.02918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 34052 1727204416.02932: stdout chunk (state=3): >>> <<< 34052 1727204416.02984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available<<< 34052 1727204416.02987: stdout chunk (state=3): >>> <<< 34052 1727204416.03013: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34052 1727204416.03045: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.03158: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.03164: stdout chunk (state=3): >>> <<< 34052 1727204416.03264: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34052 1727204416.03271: stdout chunk (state=3): >>> <<< 34052 1727204416.03312: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.03318: stdout chunk (state=3): >>> <<< 34052 1727204416.03408: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.03415: stdout chunk (state=3): >>> <<< 34052 1727204416.03507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34052 1727204416.03511: stdout chunk (state=3): >>> <<< 34052 1727204416.03538: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.03544: stdout chunk (state=3): >>> <<< 34052 1727204416.03597: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available<<< 34052 1727204416.03602: stdout chunk (state=3): >>> <<< 34052 1727204416.03653: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.03706: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.apparmor' # <<< 34052 1727204416.03743: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.03832: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.03913: 
stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.caps' # <<< 34052 1727204416.03919: stdout chunk (state=3): >>> <<< 34052 1727204416.03946: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.03955: stdout chunk (state=3): >>> <<< 34052 1727204416.04030: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.04034: stdout chunk (state=3): >>> <<< 34052 1727204416.04110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34052 1727204416.04113: stdout chunk (state=3): >>> <<< 34052 1727204416.04230: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.04248: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.04357: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.04362: stdout chunk (state=3): >>> <<< 34052 1727204416.04468: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.04575: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 34052 1727204416.04579: stdout chunk (state=3): >>> <<< 34052 1727204416.04603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 34052 1727204416.04607: stdout chunk (state=3): >>> <<< 34052 1727204416.04728: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.05516: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.06005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34052 1727204416.06017: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.06078: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.06592: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204416.06673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 34052 1727204416.06729: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 34052 1727204416.06751: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.06760: stdout chunk (state=3): >>> <<< 34052 1727204416.06928: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.07048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py<<< 34052 1727204416.07056: stdout chunk (state=3): >>> <<< 34052 1727204416.07073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc'<<< 34052 1727204416.07123: stdout chunk (state=3): >>> import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01951925d0><<< 34052 1727204416.07130: stdout chunk (state=3): >>> <<< 34052 1727204416.07162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py<<< 34052 1727204416.07227: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc'<<< 34052 
1727204416.07230: stdout chunk (state=3): >>> <<< 34052 1727204416.07447: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195193170><<< 34052 1727204416.07454: stdout chunk (state=3): >>> <<< 34052 1727204416.07473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 34052 1727204416.07481: stdout chunk (state=3): >>> <<< 34052 1727204416.07504: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.07728: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # <<< 34052 1727204416.07830: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34052 1727204416.07923: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.07931: stdout chunk (state=3): >>> <<< 34052 1727204416.08090: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 34052 1727204416.08096: stdout chunk (state=3): >>> <<< 34052 1727204416.08119: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.08132: stdout chunk (state=3): >>> <<< 34052 1727204416.08250: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.08256: stdout chunk (state=3): >>> <<< 34052 1727204416.08382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 34052 1727204416.08410: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.08416: stdout chunk (state=3): >>> <<< 34052 1727204416.08561: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py<<< 34052 1727204416.08568: stdout chunk (state=3): >>> <<< 34052 1727204416.08657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc'<<< 34052 1727204416.08664: stdout chunk (state=3): >>> <<< 34052 1727204416.08779: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 34052 1727204416.08786: stdout chunk (state=3): >>> <<< 34052 1727204416.08909: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204416.09032: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01951c6630> <<< 34052 1727204416.09304: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01951ae4e0><<< 34052 1727204416.09314: stdout chunk (state=3): >>> <<< 34052 1727204416.09332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 34052 1727204416.09335: stdout chunk (state=3): >>> <<< 34052 1727204416.09361: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.09369: stdout chunk (state=3): >>> <<< 34052 1727204416.09477: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.09484: stdout chunk (state=3): >>> <<< 34052 1727204416.09580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34052 1727204416.09585: stdout chunk (state=3): >>> <<< 34052 1727204416.09611: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.09615: stdout chunk (state=3): >>> <<< 34052 1727204416.09912: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 34052 
1727204416.09921: stdout chunk (state=3): >>> <<< 34052 1727204416.10141: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.10147: stdout chunk (state=3): >>> <<< 34052 1727204416.10411: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 34052 1727204416.10430: stdout chunk (state=3): >>> <<< 34052 1727204416.10434: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 34052 1727204416.10467: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.10472: stdout chunk (state=3): >>> <<< 34052 1727204416.10540: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.10545: stdout chunk (state=3): >>> <<< 34052 1727204416.10612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 34052 1727204416.10619: stdout chunk (state=3): >>> <<< 34052 1727204416.10646: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.10654: stdout chunk (state=3): >>> <<< 34052 1727204416.10721: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.10724: stdout chunk (state=3): >>> <<< 34052 1727204416.10814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc'<<< 34052 1727204416.10870: stdout chunk (state=3): >>> # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204416.10910: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204416.10918: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01951e1fa0><<< 34052 1727204416.10933: stdout chunk (state=3): >>> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01951e1be0><<< 34052 1727204416.10951: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.user' # <<< 34052 1727204416.10969: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.11002: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.11010: stdout chunk (state=3): >>> <<< 34052 1727204416.11028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 34052 1727204416.11058: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.11061: stdout chunk (state=3): >>> <<< 34052 1727204416.11183: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 34052 1727204416.11192: stdout chunk (state=3): >>> <<< 34052 1727204416.11215: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.11224: stdout chunk (state=3): >>> <<< 34052 1727204416.11527: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.11534: stdout chunk (state=3): >>> <<< 34052 1727204416.11824: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available<<< 34052 1727204416.11931: stdout chunk (state=3): >>> <<< 34052 1727204416.12021: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.12031: stdout chunk (state=3): >>> <<< 34052 1727204416.12208: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.12214: stdout chunk (state=3): >>> <<< 
34052 1727204416.12285: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.12291: stdout chunk (state=3): >>> <<< 34052 1727204416.12359: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 34052 1727204416.12369: stdout chunk (state=3): >>> <<< 34052 1727204416.12390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 34052 1727204416.12414: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.12455: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.12461: stdout chunk (state=3): >>> <<< 34052 1727204416.12501: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.12625: stdout chunk (state=3): >>> <<< 34052 1727204416.12777: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.13043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 34052 1727204416.13069: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34052 1727204416.13075: stdout chunk (state=3): >>> <<< 34052 1727204416.13105: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.13108: stdout chunk (state=3): >>> <<< 34052 1727204416.13345: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.13349: stdout chunk (state=3): >>> <<< 34052 1727204416.13572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34052 1727204416.13604: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.13669: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.13675: stdout chunk (state=3): >>> <<< 34052 1727204416.13738: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.13744: stdout chunk (state=3): >>> <<< 34052 1727204416.14806: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.15832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 34052 1727204416.15862: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34052 1727204416.16142: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.16249: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34052 1727204416.16282: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.16464: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.16656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34052 1727204416.16679: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.17146: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.17278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34052 1727204416.17317: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.17360: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.17367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 34052 1727204416.17393: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.17467: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.17480: stdout chunk (state=3): >>> <<< 34052 1727204416.17550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 34052 1727204416.17577: 
stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.17593: stdout chunk (state=3): >>> <<< 34052 1727204416.17798: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.17945: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.17953: stdout chunk (state=3): >>> <<< 34052 1727204416.18429: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.18724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 34052 1727204416.18752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 34052 1727204416.18757: stdout chunk (state=3): >>> <<< 34052 1727204416.18784: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.18852: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.18856: stdout chunk (state=3): >>> <<< 34052 1727204416.18916: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 34052 1727204416.18945: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.19021: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 34052 1727204416.19032: stdout chunk (state=3): >>> <<< 34052 1727204416.19042: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.19129: stdout chunk (state=3): >>> <<< 34052 1727204416.19184: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.19190: stdout chunk (state=3): >>> <<< 34052 1727204416.19315: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34052 1727204416.19318: stdout chunk (state=3): >>> <<< 34052 1727204416.19344: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.19351: stdout chunk (state=3): >>> <<< 34052 1727204416.19392: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.19398: stdout chunk (state=3): >>> <<< 34052 1727204416.19453: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available<<< 34052 1727204416.19461: stdout chunk (state=3): >>> <<< 34052 1727204416.19648: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 34052 1727204416.19667: stdout chunk (state=3): >>> <<< 34052 1727204416.19679: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.19686: stdout chunk (state=3): >>> <<< 34052 1727204416.19772: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.19778: stdout chunk (state=3): >>> <<< 34052 1727204416.19892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 34052 1727204416.19902: stdout chunk (state=3): >>> <<< 34052 1727204416.19947: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.20435: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.20646: stdout chunk (state=3): >>> <<< 34052 1727204416.20930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34052 1727204416.20960: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.21076: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.21095: stdout chunk (state=3): >>> <<< 34052 1727204416.21167: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 34052 1727204416.21209: stdout chunk (state=3): >>># zipimport: 
zlib available<<< 34052 1727204416.21223: stdout chunk (state=3): >>> <<< 34052 1727204416.21271: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.21274: stdout chunk (state=3): >>> <<< 34052 1727204416.21338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 34052 1727204416.21369: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.21373: stdout chunk (state=3): >>> <<< 34052 1727204416.21434: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.21448: stdout chunk (state=3): >>> <<< 34052 1727204416.21487: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 34052 1727204416.21506: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204416.21533: stdout chunk (state=3): >>> <<< 34052 1727204416.21579: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.21590: stdout chunk (state=3): >>> <<< 34052 1727204416.21651: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34052 1727204416.21676: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34052 1727204416.22075: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 34052 1727204416.22093: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.22302: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 34052 1727204416.22326: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34052 1727204416.22406: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.22429: stdout chunk (state=3): >>> <<< 34052 1727204416.22616: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 34052 1727204416.22637: stdout chunk (state=3): >>> <<< 34052 1727204416.22743: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 34052 1727204416.22768: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.freebsd' # <<< 34052 1727204416.22824: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available<<< 34052 1727204416.22840: stdout chunk (state=3): >>> <<< 34052 1727204416.23079: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available<<< 34052 1727204416.23305: stdout chunk (state=3): >>> <<< 34052 1727204416.23459: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.23827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34052 1727204416.23857: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34052 1727204416.23951: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.23961: stdout chunk (state=3): >>> <<< 34052 1727204416.24034: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 34052 1727204416.24070: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.24073: stdout chunk (state=3): >>> <<< 34052 1727204416.24157: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.24160: stdout chunk (state=3): >>> <<< 34052 1727204416.24245: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.openbsd' # <<< 34052 1727204416.24273: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.24425: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.24445: stdout chunk (state=3): >>> <<< 34052 1727204416.24580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 34052 1727204416.24601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 34052 1727204416.24779: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 34052 1727204416.24793: stdout chunk (state=3): >>> <<< 34052 1727204416.24939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 34052 1727204416.24960: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.compat' # <<< 34052 1727204416.24981: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 34052 1727204416.25120: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204416.25130: stdout chunk (state=3): >>> <<< 34052 1727204416.26274: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 34052 1727204416.26279: stdout chunk (state=3): >>> # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0194aba630> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0194ab9340><<< 34052 1727204416.26282: stdout chunk (state=3): >>> <<< 34052 1727204416.26392: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0194ab61b0><<< 34052 1727204416.26512: stdout chunk (state=3): >>> <<< 34052 1727204416.27592: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26514e308c194cfcd8a9c892de18dd", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", 
"ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "16", "epoch": "1727204416", "epoch_int": "1727204416", "date": "2024-09-24", "time": "15:00:16", "iso8601_micro": "2024-09-24T19:00:16.271550Z", "iso8601": "2024-09-24T19:00:16Z", "iso8601_basic": "20240924T150016271550", "iso8601_basic_short": "20240924T150016", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDLk70qUDdIlMGefmY9CSzrAInUx7bdf89EgNGwy+627RdK/JZ6JwRBpph6RT/Xj1n4IlrnVUjUiUaorYlNqj24r7gfLUKrzB4vu8pIjwI6ge8+qjGGZDnQm+SKJK65ECm944hk7VFOi1xZWQJNYN9xVACr/ifxYeQOLNjmwajWGL4iKhiO4shsjmafF13uWUiv8C8TB9VoiAf+UJPc5DUojGJ0pjF2P/VkLEYMGRslXiQJ+GH1QxrlNZZrQY5v5Xfsd7i7l5F01JvvOvVJHkZOt/vBCvIhn7TxIdIa+95vg9XsSUTY9S0avSZv95Ua/hGHIxgLE5CNJIQUdwfJnNi0gPblQGjNj3TVx+VqgLzOjFTfD8EIkJFmC/DMhm0bCDgdclIMmmhdkJDQ6ApjJcbRElBMa+IwZZd+l+qfD/DWcsigb7wftf43WI+Y74+SRpYtLmq0h3XeubKMqvxdqOIm05stM4OxvJgopHVPTepTczripmjJ0lbfD8TkdY3NYw8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFS01F5Tk75zUFCO6hP1eZVzOfFBOUa1U6ePV4u7EOwcevlrKoP/8LVaMLToSYNDptDQpZQIlpx02mv3wOPx14c=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIEbqXmW7LS2UP4fmMBI/TP3Wh1Hqq5KAj8b9n0HP0o8r", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 37226 10.31.8.176 22", "XDG_SESSION_CLASS": "user", "SELINUX_R<<< 34052 1727204416.27648: stdout chunk (state=3): >>>OLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 37226 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, 
"config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34052 1727204416.28382: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 34052 1727204416.28425: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 34052 1727204416.28520: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing 
tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib <<< 34052 1727204416.28759: stdout chunk (state=3): >>># cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # 
cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing 
ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 34052 1727204416.29086: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 
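The setup result streamed above closes with the module's invocation block: module_args records gather_subset ["min"], gather_timeout 10, an empty filter, and fact_path /etc/ansible/facts.d, so only the minimal fact set was collected for this task. A minimal sketch of reading a few of the fields visible in that payload back out of the returned JSON, assuming the standard ansible_facts wrapper whose opening brace falls before this excerpt (the variable names and the heavily truncated sample are illustrative, not part of the run):

    import json

    # Truncated stand-in for the setup module stdout captured above.
    setup_stdout = '''{"ansible_facts": {"ansible_distribution_version": "40",
                                         "ansible_service_mgr": "systemd",
                                         "ansible_pkg_mgr": "dnf",
                                         "ansible_selinux": {"status": "enabled", "mode": "enforcing"}},
                       "invocation": {"module_args": {"gather_subset": ["min"]}}}'''

    result = json.loads(setup_stdout)
    facts = result["ansible_facts"]
    print(facts["ansible_distribution_version"])                    # 40
    print(facts["ansible_service_mgr"], facts["ansible_pkg_mgr"])   # systemd dnf
    print(result["invocation"]["module_args"]["gather_subset"])     # ['min']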
34052 1727204416.29115: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 34052 1727204416.29180: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34052 1727204416.29282: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34052 1727204416.29305: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 34052 1727204416.29379: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34052 1727204416.29430: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 34052 1727204416.29454: stdout chunk (state=3): >>># destroy signal <<< 34052 1727204416.29520: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 34052 1727204416.29621: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime <<< 34052 1727204416.29641: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob <<< 34052 1727204416.29731: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep <<< 34052 1727204416.29747: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 34052 1727204416.29820: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 34052 1727204416.29842: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # 
cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator <<< 34052 1727204416.29888: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 34052 1727204416.29912: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34052 1727204416.30187: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid <<< 34052 1727204416.30294: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34052 1727204416.30334: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34052 1727204416.30447: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback <<< 34052 1727204416.30501: stdout chunk (state=3): >>># destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34052 1727204416.30539: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 34052 1727204416.30579: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 
34052 1727204416.30733: stdout chunk (state=3): >>># clear sys.audit hooks <<< 34052 1727204416.31328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204416.31333: stdout chunk (state=3): >>><<< 34052 1727204416.31335: stderr chunk (state=3): >>><<< 34052 1727204416.31694: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195da4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195d73b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195da6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195b991c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195b9a0c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
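Everything after "stdout=" in that rc=0 summary, beginning at "import _frozen_importlib # frozen" and continuing below, is not module output but the remote interpreter narrating its own imports and shutdown; the session environment captured earlier carries PYTHONVERBOSE set to 1 (see ansible_env above), which is exactly the switch that produces this kind of trace. A small local sketch of the same effect, shown only as an illustration and not tied to this host (it re-runs the current interpreter with -v, the command-line equivalent of PYTHONVERBOSE=1):

    import subprocess
    import sys

    # Run a child interpreter in verbose mode: every import plus the final
    # "# cleanup[...]" / "# destroy ..." teardown messages are emitted as a trace
    # (locally they arrive on stderr), matching the style of the lines in this log.
    trace = subprocess.run(
        [sys.executable, "-v", "-c", "import base64"],
        capture_output=True,
        text=True,
    )
    print(trace.stderr)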
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd7fb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bec140> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c0f950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c0ffe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195befc20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bed3a0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd5160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c338f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c32510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bee240> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c30d70> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c60980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd43e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c60e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c60ce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c610d0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195bd2f00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c617c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c61490> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c626c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c7c8c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c7dfd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f0195c7ee10> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c7f440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c7e360> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195c7fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c7f530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c626f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959bfcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959e87d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959e8530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959e8800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959e89e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959bde50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959ea060> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959e8d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195c62de0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a16390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a2e540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a672f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a8da90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a67410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a2f1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958644a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195a2d580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01959eaf30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0195864740> # zipimport: found 103 names in '/tmp/ansible_setup_payload_t0yldh0p/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958d2270> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958a9160> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958a8320> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958ab680> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195901ca0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195901a30> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195901340> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195901d90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01958d2f00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01959029c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195902c00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195903050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195764e60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195766a80> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195767440> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957685f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576b0e0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019576b200> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957693a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576f050> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576db20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576d880> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019576ddf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195769820> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957b3200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b3380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957b8f50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b8d10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957bb440> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b9640> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c2c60> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957bb5f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c3ec0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c3c80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c3fe0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957b3680> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c76e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c88f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c5e50> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01957c71d0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c5a90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019564c950> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564d730> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01957c41a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
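The repeated "# zipimport: zlib available" markers here, and the earlier "# zipimport: found 103 names in '/tmp/ansible_setup_payload_t0yldh0p/ansible_setup_payload.zip'" line, come from Python's standard zip import hook reading the module payload staged on the target: once that archive is on sys.path, ansible and ansible.module_utils.* are imported straight out of it. A self-contained sketch of that mechanism, using a made-up module name rather than anything from the real payload:

    import os
    import sys
    import tempfile
    import zipfile

    # Build a tiny archive standing in for ansible_setup_payload.zip.
    payload = os.path.join(tempfile.mkdtemp(), "payload.zip")
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("hello_mod.py", "GREETING = 'hi from inside the zip'\n")

    # Putting the archive on sys.path hands it to the zipimport path hook, the
    # same machinery that logged "found N names in ..." for the payload above.
    sys.path.insert(0, payload)
    import hello_mod

    print(hello_mod.GREETING)
    print(hello_mod.__file__)   # resolves to a path inside payload.zip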
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564c4d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564e4b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195656060> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01956569f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019564f350> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01956557c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195656ae0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956eacc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195660980> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019565eb10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019565e960> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956edbb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0195128350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195128920> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956cd3d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956cc350> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956ec290> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956efe00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019512b650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019512af00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f019512b0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019512a360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f019512b740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0195192240> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195190260> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01956effb0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01951925d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0195193170> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01951c6630> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01951ae4e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01951e1fa0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01951e1be0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0194aba630> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0194ab9340> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0194ab61b0> {"ansible_facts": {"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26514e308c194cfcd8a9c892de18dd", 
"ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "16", "epoch": "1727204416", "epoch_int": "1727204416", "date": "2024-09-24", "time": "15:00:16", "iso8601_micro": "2024-09-24T19:00:16.271550Z", "iso8601": "2024-09-24T19:00:16Z", "iso8601_basic": "20240924T150016271550", "iso8601_basic_short": "20240924T150016", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDLk70qUDdIlMGefmY9CSzrAInUx7bdf89EgNGwy+627RdK/JZ6JwRBpph6RT/Xj1n4IlrnVUjUiUaorYlNqj24r7gfLUKrzB4vu8pIjwI6ge8+qjGGZDnQm+SKJK65ECm944hk7VFOi1xZWQJNYN9xVACr/ifxYeQOLNjmwajWGL4iKhiO4shsjmafF13uWUiv8C8TB9VoiAf+UJPc5DUojGJ0pjF2P/VkLEYMGRslXiQJ+GH1QxrlNZZrQY5v5Xfsd7i7l5F01JvvOvVJHkZOt/vBCvIhn7TxIdIa+95vg9XsSUTY9S0avSZv95Ua/hGHIxgLE5CNJIQUdwfJnNi0gPblQGjNj3TVx+VqgLzOjFTfD8EIkJFmC/DMhm0bCDgdclIMmmhdkJDQ6ApjJcbRElBMa+IwZZd+l+qfD/DWcsigb7wftf43WI+Y74+SRpYtLmq0h3XeubKMqvxdqOIm05stM4OxvJgopHVPTepTczripmjJ0lbfD8TkdY3NYw8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFS01F5Tk75zUFCO6hP1eZVzOfFBOUa1U6ePV4u7EOwcevlrKoP/8LVaMLToSYNDptDQpZQIlpx02mv3wOPx14c=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIEbqXmW7LS2UP4fmMBI/TP3Wh1Hqq5KAj8b9n0HP0o8r", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 37226 10.31.8.176 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 37226 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # 
cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # 
destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep 
# cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 
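The OpenSSH stderr above ("auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759'", "mux_client_request_session: master session id: 4") shows this module invocation reusing an existing ControlMaster socket rather than opening a fresh SSH connection. That behaviour is driven by the ssh connection plugin's ssh_args; a hedged sketch of steering it explicitly, using environment overrides instead of ansible.cfg and values that merely mirror the common ControlMaster/ControlPersist settings:

# Typical multiplexing options for the ssh connection plugin (values are illustrative):
export ANSIBLE_SSH_ARGS='-o ControlMaster=auto -o ControlPersist=60s'
ansible-playbook -i <inventory> <playbook>.yml -vvv   # <inventory> and <playbook> are placeholders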
Shared connection to 10.31.8.176 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # 
destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy 
json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] 
wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34052 1727204416.33211: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204416.33215: _low_level_execute_command(): starting 34052 1727204416.33218: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204415.6332936-34205-119629259460752/ > /dev/null 2>&1 && sleep 0' 34052 1727204416.33473: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204416.33585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204416.33600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204416.33704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204416.33869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204416.33935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204416.36995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204416.37005: stdout chunk (state=3): >>><<< 34052 1727204416.37205: stderr chunk (state=3): >>><<< 34052 1727204416.37210: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204416.37213: handler run complete 34052 1727204416.37215: variable 'ansible_facts' from source: unknown 34052 1727204416.37219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204416.37681: variable 'ansible_facts' from source: unknown 34052 1727204416.37983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204416.38379: attempt loop complete, returning result 34052 1727204416.38383: _execute() done 34052 1727204416.38386: dumping result to json 34052 1727204416.38389: done dumping result, returning 34052 1727204416.38391: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-66a4-e2a3-0000000000ca] 34052 1727204416.38393: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ca 34052 1727204416.38600: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ca 34052 1727204416.38604: WORKER PROCESS EXITING ok: [managed-node1] 34052 1727204416.38911: no more pending results, returning what we have 34052 1727204416.38914: results queue empty 34052 1727204416.38915: checking for any_errors_fatal 34052 1727204416.38917: done checking for 
any_errors_fatal 34052 1727204416.38917: checking for max_fail_percentage 34052 1727204416.38919: done checking for max_fail_percentage 34052 1727204416.38920: checking to see if all hosts have failed and the running result is not ok 34052 1727204416.38921: done checking to see if all hosts have failed 34052 1727204416.38921: getting the remaining hosts for this loop 34052 1727204416.38923: done getting the remaining hosts for this loop 34052 1727204416.38927: getting the next task for host managed-node1 34052 1727204416.38935: done getting next task for host managed-node1 34052 1727204416.38938: ^ task is: TASK: Check if system is ostree 34052 1727204416.38941: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204416.38944: getting variables 34052 1727204416.38946: in VariableManager get_vars() 34052 1727204416.38981: Calling all_inventory to load vars for managed-node1 34052 1727204416.38984: Calling groups_inventory to load vars for managed-node1 34052 1727204416.38988: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204416.38998: Calling all_plugins_play to load vars for managed-node1 34052 1727204416.39001: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204416.39004: Calling groups_plugins_play to load vars for managed-node1 34052 1727204416.39305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204416.39519: done with get_vars() 34052 1727204416.39532: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:00:16 -0400 (0:00:00.893) 0:00:02.713 ***** 34052 1727204416.39662: entering _queue_task() for managed-node1/stat 34052 1727204416.40245: worker is 1 (out of 1 available) 34052 1727204416.40260: exiting _queue_task() for managed-node1/stat 34052 1727204416.40275: done queuing things up, now waiting for results queue to drain 34052 1727204416.40277: waiting for pending results... 
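The records that follow trace Ansible's per-task module execution path for this stat task: reuse the multiplexed SSH connection ("auto-mux ... master session id" lines), create a remote temp directory under ~/.ansible/tmp, transfer the AnsiballZ_stat.py payload over the SFTP subsystem ("sftp> put"), chmod it, run it with the remote Python interpreter, and finally remove the temp directory. Purely as an illustrative sketch of that sequence, not code taken from Ansible itself, the snippet below reproduces the same steps with plain ssh/scp calls; the host name and the local payload path are placeholders, not values from this run.

    # Illustrative sketch only; mirrors the _low_level_execute_command()
    # sequence visible in the log, it is not Ansible source code.
    import subprocess

    HOST = "managed-node1"  # placeholder; the real target comes from inventory

    def ssh(command):
        # One remote command per call, reusing the ControlMaster connection
        # that the "auto-mux" debug lines above refer to.
        result = subprocess.run(["ssh", HOST, command],
                                capture_output=True, text=True, check=True)
        return result.stdout.strip()

    # 1. create a private temp dir, as in the `umask 77 && mkdir -p ...` command below
    tmpdir = ssh('umask 77 && mkdir -p ~/.ansible/tmp/demo-tmp && echo ~/.ansible/tmp/demo-tmp')
    # 2. copy the AnsiballZ wrapper (Ansible uses the SFTP subsystem for this step)
    subprocess.run(["scp", "AnsiballZ_stat.py", f"{HOST}:{tmpdir}/"], check=True)
    # 3. make the dir and wrapper executable, then run it with the remote Python
    ssh(f"chmod u+x {tmpdir} {tmpdir}/AnsiballZ_stat.py")
    print(ssh(f"/usr/bin/python3 {tmpdir}/AnsiballZ_stat.py"))
    # 4. clean up, as in the `rm -f -r ... > /dev/null 2>&1` command earlier in the log
    ssh(f"rm -rf {tmpdir}")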
34052 1727204416.40576: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 34052 1727204416.40849: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000cc 34052 1727204416.40854: variable 'ansible_search_path' from source: unknown 34052 1727204416.40857: variable 'ansible_search_path' from source: unknown 34052 1727204416.40861: calling self._execute() 34052 1727204416.40990: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204416.41005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204416.41020: variable 'omit' from source: magic vars 34052 1727204416.41680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204416.42276: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204416.42491: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204416.42801: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204416.42808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204416.43091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204416.43126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204416.43203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204416.43634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204416.43637: Evaluated conditional (not __network_is_ostree is defined): True 34052 1727204416.43640: variable 'omit' from source: magic vars 34052 1727204416.43671: variable 'omit' from source: magic vars 34052 1727204416.43975: variable 'omit' from source: magic vars 34052 1727204416.43978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204416.44007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204416.44036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204416.44059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204416.44096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204416.44175: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204416.44186: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204416.44198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204416.44309: Set connection var ansible_connection to ssh 34052 1727204416.44323: Set connection var ansible_timeout to 10 34052 1727204416.44411: Set connection var 
ansible_pipelining to False 34052 1727204416.44414: Set connection var ansible_shell_type to sh 34052 1727204416.44417: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204416.44419: Set connection var ansible_shell_executable to /bin/sh 34052 1727204416.44421: variable 'ansible_shell_executable' from source: unknown 34052 1727204416.44423: variable 'ansible_connection' from source: unknown 34052 1727204416.44426: variable 'ansible_module_compression' from source: unknown 34052 1727204416.44428: variable 'ansible_shell_type' from source: unknown 34052 1727204416.44429: variable 'ansible_shell_executable' from source: unknown 34052 1727204416.44431: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204416.44434: variable 'ansible_pipelining' from source: unknown 34052 1727204416.44436: variable 'ansible_timeout' from source: unknown 34052 1727204416.44438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204416.44612: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204416.44639: variable 'omit' from source: magic vars 34052 1727204416.44649: starting attempt loop 34052 1727204416.44655: running the handler 34052 1727204416.44677: _low_level_execute_command(): starting 34052 1727204416.44690: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204416.45523: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204416.45543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204416.45626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204416.45669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204416.45694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204416.45851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204416.45916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204416.48020: stdout chunk (state=3): >>>/root <<< 34052 1727204416.48025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204416.48027: stdout chunk (state=3): >>><<< 34052 1727204416.48030: stderr chunk (state=3): >>><<< 34052 1727204416.48034: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204416.48045: _low_level_execute_command(): starting 34052 1727204416.48238: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479 `" && echo ansible-tmp-1727204416.4794915-34245-75645019617479="` echo /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479 `" ) && sleep 0' 34052 1727204416.49509: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204416.49516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204416.49519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204416.49522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204416.49609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204416.52310: stdout chunk (state=3): >>>ansible-tmp-1727204416.4794915-34245-75645019617479=/root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479 <<< 34052 1727204416.52531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204416.53006: stderr chunk (state=3): >>><<< 34052 1727204416.53010: stdout chunk (state=3): >>><<< 34052 1727204416.53013: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204416.4794915-34245-75645019617479=/root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204416.53016: variable 'ansible_module_compression' from source: unknown 34052 1727204416.53018: ANSIBALLZ: Using lock for stat 34052 1727204416.53021: ANSIBALLZ: Acquiring lock 34052 1727204416.53023: ANSIBALLZ: Lock acquired: 140141530568160 34052 1727204416.53028: ANSIBALLZ: Creating module 34052 1727204416.73113: ANSIBALLZ: Writing module into payload 34052 1727204416.73230: ANSIBALLZ: Writing module 34052 1727204416.73267: ANSIBALLZ: Renaming module 34052 1727204416.73287: ANSIBALLZ: Done creating module 34052 1727204416.73310: variable 'ansible_facts' from source: unknown 34052 1727204416.73412: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/AnsiballZ_stat.py 34052 1727204416.73630: Sending initial data 34052 1727204416.73640: Sent initial data (152 bytes) 34052 1727204416.74357: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204416.74386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204416.74491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204416.74521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204416.74539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204416.74612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204416.74699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204416.76496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204416.76553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204416.76692: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpeeqp4aqt /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/AnsiballZ_stat.py <<< 34052 1727204416.76696: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpeeqp4aqt" to remote "/root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/AnsiballZ_stat.py" <<< 34052 1727204416.78178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204416.78432: stderr chunk (state=3): >>><<< 34052 1727204416.78708: stdout chunk (state=3): >>><<< 34052 1727204416.78713: done transferring module to remote 34052 1727204416.78715: _low_level_execute_command(): starting 34052 1727204416.78718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/ /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/AnsiballZ_stat.py && sleep 0' 34052 1727204416.80033: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204416.80051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204416.80289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204416.80340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204416.82461: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 34052 1727204416.82485: stderr chunk (state=3): >>><<< 34052 1727204416.82531: stdout chunk (state=3): >>><<< 34052 1727204416.82752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204416.82756: _low_level_execute_command(): starting 34052 1727204416.82758: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/AnsiballZ_stat.py && sleep 0' 34052 1727204416.83941: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204416.84163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204416.84301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204416.84393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204416.86833: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34052 1727204416.86909: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 34052 1727204416.86950: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34052 1727204416.87005: stdout chunk (state=3): >>>import 'posix' # <<< 34052 1727204416.87168: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 
'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 34052 1727204416.87203: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34052 1727204416.87246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6da4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6d73b30> <<< 34052 1727204416.87276: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 34052 1727204416.87286: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6da6ab0> <<< 34052 1727204416.87309: stdout chunk (state=3): >>>import '_signal' # <<< 34052 1727204416.87336: stdout chunk (state=3): >>>import '_abc' # <<< 34052 1727204416.87359: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 34052 1727204416.87433: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 34052 1727204416.87496: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34052 1727204416.87559: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # <<< 34052 1727204416.87704: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 34052 1727204416.87708: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b551c0> <<< 34052 1727204416.87781: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b560c0> <<< 34052 1727204416.87812: stdout chunk (state=3): >>>import 'site' # <<< 34052 1727204416.87849: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34052 1727204416.88142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34052 1727204416.88146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204416.88213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34052 1727204416.88260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34052 1727204416.88359: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b93fb0> <<< 34052 1727204416.88574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34052 1727204416.88648: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6ba8140> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bcb9e0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bcbfb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6babc80> import '_functools' # <<< 34052 1727204416.88697: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6ba93a0> <<< 34052 1727204416.88789: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b91160> <<< 34052 1727204416.88811: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34052 1727204416.88839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34052 1727204416.88870: stdout chunk (state=3): >>>import '_sre' # <<< 34052 1727204416.88874: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34052 1727204416.88904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' 
<<< 34052 1727204416.88926: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34052 1727204416.88970: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bef8f0> <<< 34052 1727204416.88982: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bee510> <<< 34052 1727204416.89018: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 34052 1727204416.89063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6baa390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6beca40> <<< 34052 1727204416.89133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34052 1727204416.89226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1c950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b903e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c1ce00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1ccb0> <<< 34052 1727204416.89230: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c1d070> <<< 34052 1727204416.89287: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b8ef00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34052 1727204416.89380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1d730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1d400> <<< 34052 1727204416.89414: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1e630> <<< 34052 1727204416.89444: stdout chunk (state=3): >>>import 'importlib.util' # <<< 34052 1727204416.89710: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34052 1727204416.89714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c38860> <<< 34052 1727204416.89717: stdout chunk (state=3): >>>import 'errno' # <<< 34052 1727204416.89720: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c39fa0> <<< 34052 1727204416.89722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 34052 1727204416.89724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34052 1727204416.89726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c3ae40> <<< 34052 1727204416.89730: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c3b4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c3a390> <<< 34052 1727204416.89752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34052 1727204416.89762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34052 1727204416.89814: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c3bef0> <<< 34052 1727204416.89828: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c3b620> <<< 34052 1727204416.89883: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1e690> <<< 34052 1727204416.89938: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34052 1727204416.89960: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34052 1727204416.90040: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d69d7d40> <<< 34052 1727204416.90065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6a00860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a005c0> <<< 34052 1727204416.90120: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6a00890> <<< 34052 1727204416.90254: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6a00a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d69d5ee0> <<< 34052 1727204416.90262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34052 1727204416.90296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34052 1727204416.90346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a020f0> <<< 34052 1727204416.90393: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a00d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1ed80> <<< 34052 1727204416.90414: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34052 1727204416.90510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34052 1727204416.90569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34052 
1727204416.90583: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a2a480> <<< 34052 1727204416.90632: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34052 1727204416.90661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204416.90688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34052 1727204416.90703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34052 1727204416.90748: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a46600> <<< 34052 1727204416.90768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34052 1727204416.90807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34052 1727204416.90876: stdout chunk (state=3): >>>import 'ntpath' # <<< 34052 1727204416.90916: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a7b3e0> <<< 34052 1727204416.90919: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34052 1727204416.90962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34052 1727204416.90988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34052 1727204416.91041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34052 1727204416.91132: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6aa5b80> <<< 34052 1727204416.91211: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a7b500> <<< 34052 1727204416.91256: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a47290> <<< 34052 1727204416.91294: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68c0440> <<< 34052 1727204416.91316: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a45640> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a03020> <<< 34052 1727204416.91420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34052 1727204416.91447: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 
0x7f38d68c06e0> <<< 34052 1727204416.91523: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_8gsx9amc/ansible_stat_payload.zip' <<< 34052 1727204416.91546: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.91685: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.91732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34052 1727204416.91778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34052 1727204416.91875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34052 1727204416.91895: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 34052 1727204416.91901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 34052 1727204416.92157: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d691a270> <<< 34052 1727204416.92160: stdout chunk (state=3): >>>import '_typing' # <<< 34052 1727204416.92368: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68f1160> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68f02c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 34052 1727204416.94919: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204416.96605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68f3f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34052 1727204416.96629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34052 1727204416.96635: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204416.96658: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6945c10> <<< 34052 1727204416.96786: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d69459a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d69452b0> 
# /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34052 1727204416.96827: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6945d90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d691acf0> <<< 34052 1727204416.96849: stdout chunk (state=3): >>>import 'atexit' # <<< 34052 1727204416.96937: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6946900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6946b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34052 1727204416.97010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34052 1727204416.97039: stdout chunk (state=3): >>>import '_locale' # <<< 34052 1727204416.97096: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6946ff0> <<< 34052 1727204416.97112: stdout chunk (state=3): >>>import 'pwd' # <<< 34052 1727204416.97135: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34052 1727204416.97178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34052 1727204416.97230: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67a8d10> <<< 34052 1727204416.97324: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d67aa930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34052 1727204416.97368: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ab2c0> <<< 34052 1727204416.97390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34052 1727204416.97539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ac470> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34052 1727204416.97553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py <<< 34052 1727204416.97561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34052 1727204416.97653: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67aef60> <<< 34052 1727204416.97708: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d67af080> <<< 34052 1727204416.97767: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ad250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34052 1727204416.97832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 34052 1727204416.97936: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34052 1727204416.97986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34052 1727204416.98048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b2e70> import '_tokenize' # <<< 34052 1727204416.98149: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b1940> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b16a0> <<< 34052 1727204416.98194: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34052 1727204416.98308: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b3ce0> <<< 34052 1727204416.98424: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ad730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d67faff0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67fb170><<< 34052 1727204416.98436: stdout chunk (state=3): >>> <<< 34052 1727204416.98559: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6800da0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6800b60> <<< 34052 1727204416.98566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34052 1727204416.98762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34052 1727204416.98819: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204416.98892: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6803290> <<< 34052 1727204416.98898: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6801490> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34052 1727204416.98930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204416.98956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34052 1727204416.98982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 34052 1727204416.98992: stdout chunk (state=3): >>>import '_string' # <<< 34052 1727204416.99050: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6806a50> <<< 34052 1727204416.99272: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6803410> <<< 34052 1727204416.99371: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204416.99378: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6807800> <<< 34052 1727204416.99492: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6807830> # extension 
module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6807b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67fb470> <<< 34052 1727204416.99514: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34052 1727204416.99633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34052 1727204416.99681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d680b380> <<< 34052 1727204416.99999: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d680c680> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6809b20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d680aed0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6809730> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 34052 1727204417.00040: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.00142: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.00179: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34052 1727204417.00215: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34052 1727204417.00432: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.00488: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.01544: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.02640: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34052 1727204417.02644: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 34052 1727204417.02646: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 34052 1727204417.02698: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34052 1727204417.02737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34052 1727204417.02813: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204417.02821: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204417.02836: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d68947a0> <<< 34052 1727204417.03147: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68955b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d680c5c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34052 1727204417.03424: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.03717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 34052 1727204417.03729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34052 1727204417.03749: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6894380> <<< 34052 1727204417.03775: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.04718: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.05604: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.05735: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.05861: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34052 1727204417.05885: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.05944: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.06008: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34052 1727204417.06029: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.06154: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.06309: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34052 1727204417.06533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34052 1727204417.06941: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.07392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34052 1727204417.07504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34052 1727204417.07536: stdout chunk (state=3): >>>import '_ast' # 
<<< 34052 1727204417.07653: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6897f80> <<< 34052 1727204417.07682: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.07813: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.07938: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34052 1727204417.07942: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 34052 1727204417.07969: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 34052 1727204417.07975: stdout chunk (state=3): >>> <<< 34052 1727204417.08001: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34052 1727204417.08037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34052 1727204417.08062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34052 1727204417.08178: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34052 1727204417.08386: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d66a20c0> <<< 34052 1727204417.08470: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 34052 1727204417.08476: stdout chunk (state=3): >>> <<< 34052 1727204417.08498: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d66a2a50><<< 34052 1727204417.08548: stdout chunk (state=3): >>> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d680f980> # zipimport: zlib available<<< 34052 1727204417.08555: stdout chunk (state=3): >>> <<< 34052 1727204417.08623: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204417.08693: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 34052 1727204417.08697: stdout chunk (state=3): >>> <<< 34052 1727204417.08727: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204417.08734: stdout chunk (state=3): >>> <<< 34052 1727204417.08810: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204417.08818: stdout chunk (state=3): >>> <<< 34052 1727204417.08896: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204417.09002: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204417.09005: stdout chunk (state=3): >>> <<< 34052 1727204417.09126: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 34052 1727204417.09133: stdout chunk (state=3): >>> <<< 34052 1727204417.09223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 34052 1727204417.09226: stdout chunk (state=3): >>> <<< 34052 1727204417.09374: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 34052 1727204417.09387: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 34052 1727204417.09395: stdout chunk (state=3): >>> <<< 34052 1727204417.09401: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d66a1760><<< 34052 1727204417.09484: stdout chunk (state=3): >>> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66a2c00><<< 34052 1727204417.09488: stdout chunk (state=3): >>> <<< 34052 1727204417.09540: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 34052 1727204417.09559: stdout chunk (state=3): >>> <<< 34052 1727204417.09563: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 34052 1727204417.09598: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204417.09603: stdout chunk (state=3): >>> <<< 34052 1727204417.09822: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 34052 1727204417.09827: stdout chunk (state=3): >>> <<< 34052 1727204417.09879: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204417.09884: stdout chunk (state=3): >>> <<< 34052 1727204417.09954: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 34052 1727204417.09972: stdout chunk (state=3): >>> <<< 34052 1727204417.09979: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 34052 1727204417.09991: stdout chunk (state=3): >>> <<< 34052 1727204417.10031: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34052 1727204417.10080: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 34052 1727204417.10085: stdout chunk (state=3): >>> <<< 34052 1727204417.10217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34052 1727204417.10253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 34052 1727204417.10259: stdout chunk (state=3): >>> <<< 34052 1727204417.10300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34052 1727204417.10415: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6732cf0><<< 34052 1727204417.10418: stdout chunk (state=3): >>> <<< 34052 1727204417.10509: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66aca40><<< 34052 1727204417.10514: stdout chunk (state=3): >>> <<< 34052 1727204417.10649: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66aab10><<< 34052 1727204417.10658: stdout chunk (state=3): >>> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66aa960><<< 34052 
1727204417.10737: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34052 1727204417.10743: stdout chunk (state=3): >>># zipimport: zlib available<<< 34052 1727204417.10759: stdout chunk (state=3): >>> <<< 34052 1727204417.10796: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34052 1727204417.10805: stdout chunk (state=3): >>> <<< 34052 1727204417.10820: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 34052 1727204417.10825: stdout chunk (state=3): >>> <<< 34052 1727204417.10923: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34052 1727204417.10950: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204417.10981: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34052 1727204417.10989: stdout chunk (state=3): >>> <<< 34052 1727204417.11023: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available<<< 34052 1727204417.11030: stdout chunk (state=3): >>> <<< 34052 1727204417.11426: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.11645: stdout chunk (state=3): >>># zipimport: zlib available <<< 34052 1727204417.11869: stdout chunk (state=3): >>> <<< 34052 1727204417.11884: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}<<< 34052 1727204417.11890: stdout chunk (state=3): >>> <<< 34052 1727204417.11918: stdout chunk (state=3): >>># destroy __main__<<< 34052 1727204417.11931: stdout chunk (state=3): >>> <<< 34052 1727204417.12448: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 34052 1727204417.12451: stdout chunk (state=3): >>> <<< 34052 1727204417.12474: stdout chunk (state=3): >>># clear sys.path_hooks <<< 34052 1727204417.12496: stdout chunk (state=3): >>># clear builtins._ # clear sys.path<<< 34052 1727204417.12514: stdout chunk (state=3): >>> # clear sys.argv <<< 34052 1727204417.12530: stdout chunk (state=3): >>># clear sys.ps1 <<< 34052 1727204417.12538: stdout chunk (state=3): >>># clear sys.ps2<<< 34052 1727204417.12550: stdout chunk (state=3): >>> # clear sys.last_exc<<< 34052 1727204417.12572: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys<<< 34052 1727204417.12601: stdout chunk (state=3): >>> # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 34052 1727204417.12625: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat<<< 34052 1727204417.12655: stdout chunk (state=3): >>> # cleanup[2] removing stat # 
cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools<<< 34052 1727204417.12682: stdout chunk (state=3): >>> # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix <<< 34052 1727204417.12718: stdout chunk (state=3): >>># cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch<<< 34052 1727204417.12745: stdout chunk (state=3): >>> # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2<<< 34052 1727204417.12771: stdout chunk (state=3): >>> # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib <<< 34052 1727204417.12793: stdout chunk (state=3): >>># cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing<<< 34052 1727204417.12820: stdout chunk (state=3): >>> # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils<<< 34052 1727204417.12833: stdout chunk (state=3): >>> # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl<<< 34052 1727204417.12856: stdout chunk (state=3): >>> # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # 
cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal <<< 34052 1727204417.12879: stdout chunk (state=3): >>># cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime<<< 34052 1727204417.12903: stdout chunk (state=3): >>> # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader<<< 34052 1727204417.13246: stdout chunk (state=3): >>> # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34052 1727204417.13440: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34052 1727204417.13492: stdout chunk (state=3): >>># destroy importlib.machinery<<< 34052 1727204417.13496: stdout chunk (state=3): >>> <<< 34052 1727204417.13498: stdout chunk (state=3): >>># destroy importlib._abc<<< 34052 1727204417.13499: stdout chunk (state=3): >>> # destroy importlib.util<<< 34052 1727204417.13500: stdout chunk (state=3): >>> <<< 34052 1727204417.13533: stdout chunk (state=3): >>># destroy _bz2<<< 34052 1727204417.13539: stdout chunk (state=3): >>> <<< 34052 1727204417.13556: stdout chunk (state=3): >>># destroy _compression <<< 34052 1727204417.13567: stdout chunk (state=3): >>># destroy _lzma <<< 34052 1727204417.13610: stdout chunk (state=3): >>># destroy binascii # destroy struct<<< 34052 1727204417.13625: stdout chunk (state=3): >>> # destroy zlib<<< 34052 1727204417.13644: stdout chunk (state=3): >>> # destroy bz2<<< 34052 1727204417.13655: stdout chunk (state=3): >>> # destroy lzma<<< 34052 1727204417.13674: stdout chunk (state=3): >>> <<< 34052 1727204417.13680: stdout chunk (state=3): >>># destroy zipfile._path <<< 34052 1727204417.13706: stdout chunk (state=3): >>># destroy zipfile<<< 34052 1727204417.13716: stdout chunk (state=3): >>> <<< 34052 1727204417.13725: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress<<< 34052 1727204417.13777: stdout chunk (state=3): >>> # destroy ntpath<<< 34052 1727204417.13783: stdout chunk (state=3): >>> <<< 34052 1727204417.13803: stdout chunk (state=3): >>># destroy importlib <<< 34052 1727204417.13826: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile<<< 34052 1727204417.13840: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder<<< 34052 1727204417.13861: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json<<< 34052 1727204417.13882: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale<<< 34052 1727204417.13908: stdout chunk (state=3): >>> # destroy pwd # destroy locale <<< 34052 1727204417.13934: stdout chunk (state=3): >>># destroy signal # destroy fcntl # destroy select # destroy _signal<<< 34052 1727204417.13952: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog<<< 34052 1727204417.13961: stdout chunk (state=3): >>> # destroy uuid<<< 34052 1727204417.13996: stdout chunk (state=3): >>> # destroy selectors<<< 34052 1727204417.14009: stdout chunk (state=3): >>> <<< 34052 1727204417.14016: stdout chunk (state=3): >>># destroy errno # destroy array<<< 34052 1727204417.14040: stdout chunk (state=3): >>> # destroy datetime<<< 34052 1727204417.14048: stdout chunk (state=3): >>> <<< 34052 1727204417.14079: stdout chunk (state=3): 
>>># destroy _hashlib # destroy _blake2 # destroy selinux<<< 34052 1727204417.14107: stdout chunk (state=3): >>> # destroy shutil # destroy distro<<< 34052 1727204417.14119: stdout chunk (state=3): >>> # destroy distro.distro<<< 34052 1727204417.14196: stdout chunk (state=3): >>> # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux<<< 34052 1727204417.14203: stdout chunk (state=3): >>> <<< 34052 1727204417.14222: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian <<< 34052 1727204417.14236: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes<<< 34052 1727204417.14257: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 34052 1727204417.14264: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves<<< 34052 1727204417.14283: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 34052 1727204417.14301: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 34052 1727204417.14335: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 34052 1727204417.14347: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 34052 1727204417.14367: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc<<< 34052 1727204417.14378: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 34052 1727204417.14396: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math<<< 34052 1727204417.14407: stdout chunk (state=3): >>> # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap<<< 34052 1727204417.14430: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 34052 1727204417.14443: stdout chunk (state=3): >>># destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 34052 1727204417.14469: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 34052 1727204417.14486: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 34052 1727204417.14495: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc<<< 34052 1727204417.14505: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator<<< 34052 1727204417.14537: stdout chunk (state=3): >>> # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 34052 1727204417.14545: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 34052 1727204417.14570: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc<<< 34052 1727204417.14580: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] 
wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time<<< 34052 1727204417.14604: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 34052 1727204417.14616: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 34052 1727204417.14648: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 34052 1727204417.14651: stdout chunk (state=3): >>> <<< 34052 1727204417.14829: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34052 1727204417.14936: stdout chunk (state=3): >>># destroy sys.monitoring<<< 34052 1727204417.14941: stdout chunk (state=3): >>> <<< 34052 1727204417.14955: stdout chunk (state=3): >>># destroy _socket<<< 34052 1727204417.14958: stdout chunk (state=3): >>> <<< 34052 1727204417.14991: stdout chunk (state=3): >>># destroy _collections<<< 34052 1727204417.14998: stdout chunk (state=3): >>> <<< 34052 1727204417.15046: stdout chunk (state=3): >>># destroy platform<<< 34052 1727204417.15057: stdout chunk (state=3): >>> <<< 34052 1727204417.15060: stdout chunk (state=3): >>># destroy _uuid<<< 34052 1727204417.15073: stdout chunk (state=3): >>> <<< 34052 1727204417.15077: stdout chunk (state=3): >>># destroy stat <<< 34052 1727204417.15086: stdout chunk (state=3): >>># destroy genericpath<<< 34052 1727204417.15106: stdout chunk (state=3): >>> <<< 34052 1727204417.15113: stdout chunk (state=3): >>># destroy re._parser <<< 34052 1727204417.15128: stdout chunk (state=3): >>># destroy tokenize <<< 34052 1727204417.15164: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib<<< 34052 1727204417.15183: stdout chunk (state=3): >>> <<< 34052 1727204417.15188: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib<<< 34052 1727204417.15252: stdout chunk (state=3): >>> # destroy _typing <<< 34052 1727204417.15268: stdout chunk (state=3): >>># destroy _tokenize <<< 34052 1727204417.15277: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse<<< 34052 1727204417.15295: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves<<< 34052 1727204417.15360: stdout chunk (state=3): >>> # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path <<< 34052 1727204417.15367: stdout chunk (state=3): >>># clear sys.modules<<< 34052 1727204417.15380: stdout chunk (state=3): >>> # destroy _frozen_importlib<<< 34052 1727204417.15427: stdout chunk (state=3): >>> <<< 34052 1727204417.15515: stdout chunk (state=3): >>># destroy codecs<<< 34052 1727204417.15538: stdout chunk (state=3): >>> # destroy encodings.aliases<<< 34052 1727204417.15544: stdout chunk (state=3): >>> # destroy encodings.utf_8 # destroy encodings.utf_8_sig<<< 34052 1727204417.15574: stdout chunk (state=3): >>> # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback <<< 34052 1727204417.15586: stdout chunk (state=3): >>># destroy warnings # destroy 
weakref # destroy collections # destroy threading # destroy atexit<<< 34052 1727204417.15616: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 34052 1727204417.15659: stdout chunk (state=3): >>> # destroy _random<<< 34052 1727204417.15666: stdout chunk (state=3): >>> <<< 34052 1727204417.15683: stdout chunk (state=3): >>># destroy _weakref <<< 34052 1727204417.15749: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools<<< 34052 1727204417.15752: stdout chunk (state=3): >>> <<< 34052 1727204417.15772: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix<<< 34052 1727204417.15811: stdout chunk (state=3): >>> # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks<<< 34052 1727204417.15815: stdout chunk (state=3): >>> <<< 34052 1727204417.16372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204417.16442: stderr chunk (state=3): >>><<< 34052 1727204417.16445: stdout chunk (state=3): >>><<< 34052 1727204417.16506: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6da4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6d73b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6da6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b551c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b560c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b93fb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6ba8140> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bcb9e0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bcbfb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6babc80> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6ba93a0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b91160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bef8f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6bee510> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6baa390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6beca40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1c950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b903e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c1ce00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1ccb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c1d070> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6b8ef00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1d730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1d400> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1e630> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c38860> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c39fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches 
/usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c3ae40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c3b4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c3a390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6c3bef0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c3b620> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1e690> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d69d7d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6a00860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a005c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6a00890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6a00a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d69d5ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches 
/usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a020f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a00d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6c1ed80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a2a480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a46600> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a7b3e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6aa5b80> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a7b500> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a47290> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68c0440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a45640> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6a03020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f38d68c06e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_8gsx9amc/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d691a270> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68f1160> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68f02c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68f3f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6945c10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d69459a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d69452b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6945d90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d691acf0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6946900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6946b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches 
/usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6946ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67a8d10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d67aa930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ab2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ac470> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67aef60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d67af080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ad250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b2e70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b1940> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b16a0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f38d67b3ce0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67ad730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d67faff0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67fb170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6800da0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6800b60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6803290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6801490> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6806a50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6803410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6807800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6807830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d6807b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d67fb470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d680b380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d680c680> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6809b20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d680aed0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6809730> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d68947a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 
'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d68955b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d680c5c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6894380> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6897f80> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d66a20c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38d66a2a50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d680f980> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f38d66a1760> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66a2c00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d6732cf0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66aca40> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66aab10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38d66aa960> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] 
removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 
# destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy 
tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34052 1727204417.17070: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204417.17074: _low_level_execute_command(): starting 34052 1727204417.17076: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204416.4794915-34245-75645019617479/ > /dev/null 2>&1 && sleep 0' 34052 1727204417.17237: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204417.17240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204417.17243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204417.17302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204417.17305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204417.17316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204417.17369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204417.20102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204417.20169: stderr chunk (state=3): >>><<< 34052 1727204417.20173: stdout chunk (state=3): >>><<< 34052 1727204417.20189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204417.20195: handler run complete 34052 1727204417.20211: attempt loop complete, returning result 34052 1727204417.20214: _execute() done 34052 1727204417.20216: dumping result to json 34052 1727204417.20220: done dumping result, returning 34052 1727204417.20230: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [127b8e07-fff9-66a4-e2a3-0000000000cc] 34052 1727204417.20232: sending task 
result for task 127b8e07-fff9-66a4-e2a3-0000000000cc 34052 1727204417.20335: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000cc 34052 1727204417.20337: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 34052 1727204417.20418: no more pending results, returning what we have 34052 1727204417.20421: results queue empty 34052 1727204417.20422: checking for any_errors_fatal 34052 1727204417.20431: done checking for any_errors_fatal 34052 1727204417.20432: checking for max_fail_percentage 34052 1727204417.20433: done checking for max_fail_percentage 34052 1727204417.20434: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.20435: done checking to see if all hosts have failed 34052 1727204417.20436: getting the remaining hosts for this loop 34052 1727204417.20438: done getting the remaining hosts for this loop 34052 1727204417.20442: getting the next task for host managed-node1 34052 1727204417.20449: done getting next task for host managed-node1 34052 1727204417.20452: ^ task is: TASK: Set flag to indicate system is ostree 34052 1727204417.20455: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204417.20458: getting variables 34052 1727204417.20459: in VariableManager get_vars() 34052 1727204417.20491: Calling all_inventory to load vars for managed-node1 34052 1727204417.20493: Calling groups_inventory to load vars for managed-node1 34052 1727204417.20497: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.20508: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.20511: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.20514: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.20754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.21046: done with get_vars() 34052 1727204417.21059: done getting variables 34052 1727204417.21177: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.815) 0:00:03.529 ***** 34052 1727204417.21210: entering _queue_task() for managed-node1/set_fact 34052 1727204417.21212: Creating lock for set_fact 34052 1727204417.21633: worker is 1 (out of 1 available) 34052 1727204417.21648: exiting _queue_task() for managed-node1/set_fact 34052 1727204417.21806: done queuing things up, now waiting for results queue to drain 34052 1727204417.21808: waiting for pending results... 
34052 1727204417.22039: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 34052 1727204417.22137: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000cd 34052 1727204417.22140: variable 'ansible_search_path' from source: unknown 34052 1727204417.22143: variable 'ansible_search_path' from source: unknown 34052 1727204417.22145: calling self._execute() 34052 1727204417.22213: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.22241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.22247: variable 'omit' from source: magic vars 34052 1727204417.23006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204417.23287: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204417.23363: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204417.23405: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204417.23456: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204417.23562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204417.23599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204417.23633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204417.23680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204417.23838: Evaluated conditional (not __network_is_ostree is defined): True 34052 1727204417.23872: variable 'omit' from source: magic vars 34052 1727204417.23907: variable 'omit' from source: magic vars 34052 1727204417.24058: variable '__ostree_booted_stat' from source: set_fact 34052 1727204417.24138: variable 'omit' from source: magic vars 34052 1727204417.24200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204417.24216: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204417.24246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204417.24271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204417.24287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204417.24370: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204417.24373: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.24376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.24473: Set connection var ansible_connection to ssh 34052 
1727204417.24487: Set connection var ansible_timeout to 10 34052 1727204417.24497: Set connection var ansible_pipelining to False 34052 1727204417.24504: Set connection var ansible_shell_type to sh 34052 1727204417.24516: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204417.24551: Set connection var ansible_shell_executable to /bin/sh 34052 1727204417.24575: variable 'ansible_shell_executable' from source: unknown 34052 1727204417.24634: variable 'ansible_connection' from source: unknown 34052 1727204417.24638: variable 'ansible_module_compression' from source: unknown 34052 1727204417.24647: variable 'ansible_shell_type' from source: unknown 34052 1727204417.24649: variable 'ansible_shell_executable' from source: unknown 34052 1727204417.24651: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.24655: variable 'ansible_pipelining' from source: unknown 34052 1727204417.24660: variable 'ansible_timeout' from source: unknown 34052 1727204417.24662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.24769: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204417.24788: variable 'omit' from source: magic vars 34052 1727204417.24797: starting attempt loop 34052 1727204417.24804: running the handler 34052 1727204417.24820: handler run complete 34052 1727204417.24851: attempt loop complete, returning result 34052 1727204417.24854: _execute() done 34052 1727204417.24862: dumping result to json 34052 1727204417.24864: done dumping result, returning 34052 1727204417.24984: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [127b8e07-fff9-66a4-e2a3-0000000000cd] 34052 1727204417.24987: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000cd 34052 1727204417.25067: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000cd 34052 1727204417.25071: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 34052 1727204417.25161: no more pending results, returning what we have 34052 1727204417.25164: results queue empty 34052 1727204417.25168: checking for any_errors_fatal 34052 1727204417.25176: done checking for any_errors_fatal 34052 1727204417.25177: checking for max_fail_percentage 34052 1727204417.25178: done checking for max_fail_percentage 34052 1727204417.25179: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.25181: done checking to see if all hosts have failed 34052 1727204417.25182: getting the remaining hosts for this loop 34052 1727204417.25183: done getting the remaining hosts for this loop 34052 1727204417.25188: getting the next task for host managed-node1 34052 1727204417.25196: done getting next task for host managed-node1 34052 1727204417.25198: ^ task is: TASK: Fix CentOS6 Base repo 34052 1727204417.25201: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204417.25206: getting variables 34052 1727204417.25208: in VariableManager get_vars() 34052 1727204417.25240: Calling all_inventory to load vars for managed-node1 34052 1727204417.25244: Calling groups_inventory to load vars for managed-node1 34052 1727204417.25248: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.25262: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.25371: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.25404: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.26184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.26558: done with get_vars() 34052 1727204417.26594: done getting variables 34052 1727204417.26736: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.055) 0:00:03.584 ***** 34052 1727204417.26768: entering _queue_task() for managed-node1/copy 34052 1727204417.27092: worker is 1 (out of 1 available) 34052 1727204417.27107: exiting _queue_task() for managed-node1/copy 34052 1727204417.27121: done queuing things up, now waiting for results queue to drain 34052 1727204417.27123: waiting for pending results... 
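The "Fix CentOS6 Base repo" task queued above (el_repo_setup.yml:26, copy action) is skipped in the next entries because its when clause requires a CentOS host. A hedged sketch of the shape such a task likely has; only the task name, the copy action, and the condition ansible_distribution == 'CentOS' come from this log, and the destination and content are placeholders:

    - name: Fix CentOS6 Base repo
      ansible.builtin.copy:
        dest: /etc/yum.repos.d/CentOS-Base.repo   # placeholder path; not shown in this log
        content: |
          # repo file body omitted; it is not recoverable from this log
      when: ansible_distribution == 'CentOS'   # evaluated False here, so the task is skipped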
34052 1727204417.27484: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 34052 1727204417.27494: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000cf 34052 1727204417.27498: variable 'ansible_search_path' from source: unknown 34052 1727204417.27501: variable 'ansible_search_path' from source: unknown 34052 1727204417.27630: calling self._execute() 34052 1727204417.27633: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.27636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.27639: variable 'omit' from source: magic vars 34052 1727204417.28150: variable 'ansible_distribution' from source: facts 34052 1727204417.28181: Evaluated conditional (ansible_distribution == 'CentOS'): False 34052 1727204417.28184: when evaluation is False, skipping this task 34052 1727204417.28187: _execute() done 34052 1727204417.28189: dumping result to json 34052 1727204417.28192: done dumping result, returning 34052 1727204417.28197: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [127b8e07-fff9-66a4-e2a3-0000000000cf] 34052 1727204417.28202: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000cf 34052 1727204417.28430: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000cf 34052 1727204417.28433: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 34052 1727204417.28568: no more pending results, returning what we have 34052 1727204417.28572: results queue empty 34052 1727204417.28573: checking for any_errors_fatal 34052 1727204417.28578: done checking for any_errors_fatal 34052 1727204417.28578: checking for max_fail_percentage 34052 1727204417.28580: done checking for max_fail_percentage 34052 1727204417.28581: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.28582: done checking to see if all hosts have failed 34052 1727204417.28583: getting the remaining hosts for this loop 34052 1727204417.28584: done getting the remaining hosts for this loop 34052 1727204417.28588: getting the next task for host managed-node1 34052 1727204417.28593: done getting next task for host managed-node1 34052 1727204417.28596: ^ task is: TASK: Include the task 'enable_epel.yml' 34052 1727204417.28599: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204417.28603: getting variables 34052 1727204417.28604: in VariableManager get_vars() 34052 1727204417.28634: Calling all_inventory to load vars for managed-node1 34052 1727204417.28636: Calling groups_inventory to load vars for managed-node1 34052 1727204417.28640: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.28651: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.28654: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.28657: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.29475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.29955: done with get_vars() 34052 1727204417.29968: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.032) 0:00:03.617 ***** 34052 1727204417.30273: entering _queue_task() for managed-node1/include_tasks 34052 1727204417.30810: worker is 1 (out of 1 available) 34052 1727204417.30824: exiting _queue_task() for managed-node1/include_tasks 34052 1727204417.30840: done queuing things up, now waiting for results queue to drain 34052 1727204417.30842: waiting for pending results... 34052 1727204417.31383: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 34052 1727204417.31434: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000d0 34052 1727204417.31530: variable 'ansible_search_path' from source: unknown 34052 1727204417.31539: variable 'ansible_search_path' from source: unknown 34052 1727204417.31586: calling self._execute() 34052 1727204417.31760: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.31970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.31974: variable 'omit' from source: magic vars 34052 1727204417.32959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204417.38364: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204417.38371: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204417.38510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204417.38555: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204417.38604: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204417.38769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204417.38910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204417.38941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 34052 1727204417.39054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204417.39105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204417.39398: variable '__network_is_ostree' from source: set_fact 34052 1727204417.39422: Evaluated conditional (not __network_is_ostree | d(false)): True 34052 1727204417.39434: _execute() done 34052 1727204417.39591: dumping result to json 34052 1727204417.39596: done dumping result, returning 34052 1727204417.39599: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-66a4-e2a3-0000000000d0] 34052 1727204417.39601: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000d0 34052 1727204417.39686: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000d0 34052 1727204417.39690: WORKER PROCESS EXITING 34052 1727204417.39719: no more pending results, returning what we have 34052 1727204417.39728: in VariableManager get_vars() 34052 1727204417.39768: Calling all_inventory to load vars for managed-node1 34052 1727204417.39771: Calling groups_inventory to load vars for managed-node1 34052 1727204417.39775: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.39788: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.39792: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.39796: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.40192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.40468: done with get_vars() 34052 1727204417.40478: variable 'ansible_search_path' from source: unknown 34052 1727204417.40624: variable 'ansible_search_path' from source: unknown 34052 1727204417.40672: we have included files to process 34052 1727204417.40673: generating all_blocks data 34052 1727204417.40675: done generating all_blocks data 34052 1727204417.40687: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34052 1727204417.40689: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34052 1727204417.40692: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34052 1727204417.41584: done processing included file 34052 1727204417.41587: iterating over new_blocks loaded from include file 34052 1727204417.41589: in VariableManager get_vars() 34052 1727204417.41603: done with get_vars() 34052 1727204417.41604: filtering new block on tags 34052 1727204417.41630: done filtering new block on tags 34052 1727204417.41633: in VariableManager get_vars() 34052 1727204417.41646: done with get_vars() 34052 1727204417.41647: filtering new block on tags 34052 1727204417.41665: done filtering new block on tags 34052 1727204417.41667: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 34052 1727204417.41679: extending task lists for all hosts with included blocks 
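The include above is a conditional include_tasks: the conditional (not __network_is_ostree | d(false)) evaluates True, so enable_epel.yml is loaded and its blocks are appended to the host's task list. Reconstructed from the task name, the task path (el_repo_setup.yml:51), and the evaluated condition, the task presumably looks roughly like this; the relative path is an assumption:

    - name: Include the task 'enable_epel.yml'
      ansible.builtin.include_tasks: tasks/enable_epel.yml   # relative path assumed
      when: not __network_is_ostree | d(false)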
34052 1727204417.41806: done extending task lists 34052 1727204417.41808: done processing included files 34052 1727204417.41809: results queue empty 34052 1727204417.41810: checking for any_errors_fatal 34052 1727204417.41813: done checking for any_errors_fatal 34052 1727204417.41814: checking for max_fail_percentage 34052 1727204417.41815: done checking for max_fail_percentage 34052 1727204417.41816: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.41817: done checking to see if all hosts have failed 34052 1727204417.41817: getting the remaining hosts for this loop 34052 1727204417.41819: done getting the remaining hosts for this loop 34052 1727204417.41821: getting the next task for host managed-node1 34052 1727204417.41825: done getting next task for host managed-node1 34052 1727204417.41827: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 34052 1727204417.41830: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204417.41832: getting variables 34052 1727204417.41833: in VariableManager get_vars() 34052 1727204417.41842: Calling all_inventory to load vars for managed-node1 34052 1727204417.41844: Calling groups_inventory to load vars for managed-node1 34052 1727204417.41847: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.41852: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.41860: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.41863: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.42086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.42339: done with get_vars() 34052 1727204417.42349: done getting variables 34052 1727204417.42439: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34052 1727204417.42678: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.126) 0:00:03.744 ***** 34052 1727204417.42729: entering _queue_task() for managed-node1/command 34052 1727204417.42731: Creating lock for command 34052 1727204417.43159: worker is 1 (out of 1 available) 34052 1727204417.43318: exiting _queue_task() for managed-node1/command 34052 1727204417.43331: done queuing things up, now waiting for results queue to drain 34052 1727204417.43333: waiting for pending results... 
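The banner "TASK [Create EPEL 40]" shows the templated task name "Create EPEL {{ ansible_distribution_major_version }}" rendered with the fact value 40 (enable_epel.yml:8, command action). The next entries skip it because ansible_distribution in ['RedHat', 'CentOS'] is False on this node; the same guard also explains the skips of the following Install yum-utils and Enable EPEL 7/8/6 tasks. A hedged sketch of the pattern only; the real command is not recoverable from this log, so the cmd below is a placeholder:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      ansible.builtin.command:
        cmd: /usr/bin/true   # placeholder; the real command is not shown in this log
      when: ansible_distribution in ['RedHat', 'CentOS']   # False on this host, so the task is skipped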
34052 1727204417.43686: running TaskExecutor() for managed-node1/TASK: Create EPEL 40 34052 1727204417.43737: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000ea 34052 1727204417.43741: variable 'ansible_search_path' from source: unknown 34052 1727204417.43744: variable 'ansible_search_path' from source: unknown 34052 1727204417.43789: calling self._execute() 34052 1727204417.43892: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.43896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.43954: variable 'omit' from source: magic vars 34052 1727204417.44410: variable 'ansible_distribution' from source: facts 34052 1727204417.44430: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34052 1727204417.44547: when evaluation is False, skipping this task 34052 1727204417.44550: _execute() done 34052 1727204417.44553: dumping result to json 34052 1727204417.44555: done dumping result, returning 34052 1727204417.44558: done running TaskExecutor() for managed-node1/TASK: Create EPEL 40 [127b8e07-fff9-66a4-e2a3-0000000000ea] 34052 1727204417.44560: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ea 34052 1727204417.44975: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ea 34052 1727204417.44978: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34052 1727204417.45035: no more pending results, returning what we have 34052 1727204417.45046: results queue empty 34052 1727204417.45048: checking for any_errors_fatal 34052 1727204417.45049: done checking for any_errors_fatal 34052 1727204417.45050: checking for max_fail_percentage 34052 1727204417.45051: done checking for max_fail_percentage 34052 1727204417.45052: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.45053: done checking to see if all hosts have failed 34052 1727204417.45054: getting the remaining hosts for this loop 34052 1727204417.45056: done getting the remaining hosts for this loop 34052 1727204417.45060: getting the next task for host managed-node1 34052 1727204417.45069: done getting next task for host managed-node1 34052 1727204417.45072: ^ task is: TASK: Install yum-utils package 34052 1727204417.45078: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204417.45083: getting variables 34052 1727204417.45085: in VariableManager get_vars() 34052 1727204417.45120: Calling all_inventory to load vars for managed-node1 34052 1727204417.45123: Calling groups_inventory to load vars for managed-node1 34052 1727204417.45127: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.45142: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.45369: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.45378: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.45822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.46422: done with get_vars() 34052 1727204417.46434: done getting variables 34052 1727204417.46597: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.038) 0:00:03.783 ***** 34052 1727204417.46630: entering _queue_task() for managed-node1/package 34052 1727204417.46631: Creating lock for package 34052 1727204417.47004: worker is 1 (out of 1 available) 34052 1727204417.47024: exiting _queue_task() for managed-node1/package 34052 1727204417.47041: done queuing things up, now waiting for results queue to drain 34052 1727204417.47043: waiting for pending results... 
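The "Install yum-utils package" task (enable_epel.yml:26) resolves to the package action plugin and is skipped by the same distribution guard. A minimal sketch consistent with the log; the package name is inferred from the task name and the state is an assumption:

    - name: Install yum-utils package
      ansible.builtin.package:
        name: yum-utils          # package name inferred from the task name
        state: present           # assumed
      when: ansible_distribution in ['RedHat', 'CentOS']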
34052 1727204417.47307: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 34052 1727204417.47452: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000eb 34052 1727204417.47457: variable 'ansible_search_path' from source: unknown 34052 1727204417.47558: variable 'ansible_search_path' from source: unknown 34052 1727204417.47563: calling self._execute() 34052 1727204417.47616: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.47628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.47644: variable 'omit' from source: magic vars 34052 1727204417.48128: variable 'ansible_distribution' from source: facts 34052 1727204417.48205: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34052 1727204417.48209: when evaluation is False, skipping this task 34052 1727204417.48211: _execute() done 34052 1727204417.48228: dumping result to json 34052 1727204417.48232: done dumping result, returning 34052 1727204417.48235: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [127b8e07-fff9-66a4-e2a3-0000000000eb] 34052 1727204417.48237: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000eb skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34052 1727204417.48742: no more pending results, returning what we have 34052 1727204417.48747: results queue empty 34052 1727204417.48747: checking for any_errors_fatal 34052 1727204417.48754: done checking for any_errors_fatal 34052 1727204417.48762: checking for max_fail_percentage 34052 1727204417.48764: done checking for max_fail_percentage 34052 1727204417.48765: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.48768: done checking to see if all hosts have failed 34052 1727204417.48768: getting the remaining hosts for this loop 34052 1727204417.48770: done getting the remaining hosts for this loop 34052 1727204417.48775: getting the next task for host managed-node1 34052 1727204417.48782: done getting next task for host managed-node1 34052 1727204417.48785: ^ task is: TASK: Enable EPEL 7 34052 1727204417.48791: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204417.48795: getting variables 34052 1727204417.48797: in VariableManager get_vars() 34052 1727204417.48835: Calling all_inventory to load vars for managed-node1 34052 1727204417.48838: Calling groups_inventory to load vars for managed-node1 34052 1727204417.48843: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.48972: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.48981: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.48986: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.49249: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000eb 34052 1727204417.49252: WORKER PROCESS EXITING 34052 1727204417.49278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.49533: done with get_vars() 34052 1727204417.49545: done getting variables 34052 1727204417.49617: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.030) 0:00:03.813 ***** 34052 1727204417.49655: entering _queue_task() for managed-node1/command 34052 1727204417.50013: worker is 1 (out of 1 available) 34052 1727204417.50026: exiting _queue_task() for managed-node1/command 34052 1727204417.50039: done queuing things up, now waiting for results queue to drain 34052 1727204417.50041: waiting for pending results... 
34052 1727204417.50416: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 34052 1727204417.50562: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000ec 34052 1727204417.50586: variable 'ansible_search_path' from source: unknown 34052 1727204417.50601: variable 'ansible_search_path' from source: unknown 34052 1727204417.50650: calling self._execute() 34052 1727204417.50749: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.50762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.50780: variable 'omit' from source: magic vars 34052 1727204417.51228: variable 'ansible_distribution' from source: facts 34052 1727204417.51246: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34052 1727204417.51261: when evaluation is False, skipping this task 34052 1727204417.51270: _execute() done 34052 1727204417.51279: dumping result to json 34052 1727204417.51285: done dumping result, returning 34052 1727204417.51294: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [127b8e07-fff9-66a4-e2a3-0000000000ec] 34052 1727204417.51300: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ec 34052 1727204417.51513: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ec 34052 1727204417.51515: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34052 1727204417.51571: no more pending results, returning what we have 34052 1727204417.51574: results queue empty 34052 1727204417.51575: checking for any_errors_fatal 34052 1727204417.51588: done checking for any_errors_fatal 34052 1727204417.51589: checking for max_fail_percentage 34052 1727204417.51590: done checking for max_fail_percentage 34052 1727204417.51591: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.51592: done checking to see if all hosts have failed 34052 1727204417.51593: getting the remaining hosts for this loop 34052 1727204417.51594: done getting the remaining hosts for this loop 34052 1727204417.51599: getting the next task for host managed-node1 34052 1727204417.51607: done getting next task for host managed-node1 34052 1727204417.51610: ^ task is: TASK: Enable EPEL 8 34052 1727204417.51616: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204417.51622: getting variables 34052 1727204417.51690: in VariableManager get_vars() 34052 1727204417.51725: Calling all_inventory to load vars for managed-node1 34052 1727204417.51727: Calling groups_inventory to load vars for managed-node1 34052 1727204417.51731: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.51855: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.51859: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.51862: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.52126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.52406: done with get_vars() 34052 1727204417.52455: done getting variables 34052 1727204417.52537: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.029) 0:00:03.842 ***** 34052 1727204417.52572: entering _queue_task() for managed-node1/command 34052 1727204417.53091: worker is 1 (out of 1 available) 34052 1727204417.53104: exiting _queue_task() for managed-node1/command 34052 1727204417.53117: done queuing things up, now waiting for results queue to drain 34052 1727204417.53119: waiting for pending results... 34052 1727204417.53358: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 34052 1727204417.53741: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000ed 34052 1727204417.53745: variable 'ansible_search_path' from source: unknown 34052 1727204417.53747: variable 'ansible_search_path' from source: unknown 34052 1727204417.53750: calling self._execute() 34052 1727204417.53753: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.53755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.53764: variable 'omit' from source: magic vars 34052 1727204417.54214: variable 'ansible_distribution' from source: facts 34052 1727204417.54233: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34052 1727204417.54241: when evaluation is False, skipping this task 34052 1727204417.54249: _execute() done 34052 1727204417.54255: dumping result to json 34052 1727204417.54262: done dumping result, returning 34052 1727204417.54275: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [127b8e07-fff9-66a4-e2a3-0000000000ed] 34052 1727204417.54295: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ed skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34052 1727204417.54539: no more pending results, returning what we have 34052 1727204417.54544: results queue empty 34052 1727204417.54545: checking for any_errors_fatal 34052 1727204417.54550: done checking for any_errors_fatal 34052 1727204417.54550: checking for max_fail_percentage 34052 1727204417.54552: done checking for max_fail_percentage 34052 1727204417.54553: checking to see if all hosts have 
failed and the running result is not ok 34052 1727204417.54554: done checking to see if all hosts have failed 34052 1727204417.54554: getting the remaining hosts for this loop 34052 1727204417.54556: done getting the remaining hosts for this loop 34052 1727204417.54560: getting the next task for host managed-node1 34052 1727204417.54571: done getting next task for host managed-node1 34052 1727204417.54574: ^ task is: TASK: Enable EPEL 6 34052 1727204417.54580: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204417.54585: getting variables 34052 1727204417.54587: in VariableManager get_vars() 34052 1727204417.54621: Calling all_inventory to load vars for managed-node1 34052 1727204417.54624: Calling groups_inventory to load vars for managed-node1 34052 1727204417.54629: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.54643: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.54646: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.54650: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.55086: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ed 34052 1727204417.55089: WORKER PROCESS EXITING 34052 1727204417.55120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.55452: done with get_vars() 34052 1727204417.55467: done getting variables 34052 1727204417.55532: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.030) 0:00:03.873 ***** 34052 1727204417.55588: entering _queue_task() for managed-node1/copy 34052 1727204417.56038: worker is 1 (out of 1 available) 34052 1727204417.56051: exiting _queue_task() for managed-node1/copy 34052 1727204417.56061: done queuing things up, now waiting for results queue to drain 34052 1727204417.56063: waiting for pending results... 
34052 1727204417.56249: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 34052 1727204417.56378: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000ef 34052 1727204417.56403: variable 'ansible_search_path' from source: unknown 34052 1727204417.56416: variable 'ansible_search_path' from source: unknown 34052 1727204417.56459: calling self._execute() 34052 1727204417.56555: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.56573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.56588: variable 'omit' from source: magic vars 34052 1727204417.57775: variable 'ansible_distribution' from source: facts 34052 1727204417.57885: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34052 1727204417.57889: when evaluation is False, skipping this task 34052 1727204417.57892: _execute() done 34052 1727204417.57895: dumping result to json 34052 1727204417.57897: done dumping result, returning 34052 1727204417.57899: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [127b8e07-fff9-66a4-e2a3-0000000000ef] 34052 1727204417.57902: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ef skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34052 1727204417.58350: no more pending results, returning what we have 34052 1727204417.58354: results queue empty 34052 1727204417.58355: checking for any_errors_fatal 34052 1727204417.58359: done checking for any_errors_fatal 34052 1727204417.58360: checking for max_fail_percentage 34052 1727204417.58362: done checking for max_fail_percentage 34052 1727204417.58362: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.58363: done checking to see if all hosts have failed 34052 1727204417.58364: getting the remaining hosts for this loop 34052 1727204417.58368: done getting the remaining hosts for this loop 34052 1727204417.58372: getting the next task for host managed-node1 34052 1727204417.58382: done getting next task for host managed-node1 34052 1727204417.58385: ^ task is: TASK: Set network provider to 'nm' 34052 1727204417.58388: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204417.58393: getting variables 34052 1727204417.58395: in VariableManager get_vars() 34052 1727204417.58434: Calling all_inventory to load vars for managed-node1 34052 1727204417.58437: Calling groups_inventory to load vars for managed-node1 34052 1727204417.58442: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.58460: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.58463: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.58776: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.59426: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000ef 34052 1727204417.59430: WORKER PROCESS EXITING 34052 1727204417.59464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.59749: done with get_vars() 34052 1727204417.59760: done getting variables 34052 1727204417.59825: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.042) 0:00:03.915 ***** 34052 1727204417.59858: entering _queue_task() for managed-node1/set_fact 34052 1727204417.60573: worker is 1 (out of 1 available) 34052 1727204417.60585: exiting _queue_task() for managed-node1/set_fact 34052 1727204417.60603: done queuing things up, now waiting for results queue to drain 34052 1727204417.60606: waiting for pending results... 
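"Set network provider to 'nm'" (tests_ipv6_nm.yml:13) is a plain set_fact with no when clause, so the handler runs immediately and, as the next entries show, returns network_provider: "nm" for the host. Reconstructed from the task name, task path, and the returned ansible_facts, the task is presumably just:

    - name: Set network provider to 'nm'
      ansible.builtin.set_fact:
        network_provider: nm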
34052 1727204417.61011: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 34052 1727204417.61181: in run() - task 127b8e07-fff9-66a4-e2a3-000000000007 34052 1727204417.61213: variable 'ansible_search_path' from source: unknown 34052 1727204417.61261: calling self._execute() 34052 1727204417.61358: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.61374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.61388: variable 'omit' from source: magic vars 34052 1727204417.61508: variable 'omit' from source: magic vars 34052 1727204417.61549: variable 'omit' from source: magic vars 34052 1727204417.61598: variable 'omit' from source: magic vars 34052 1727204417.61652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204417.61705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204417.61737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204417.61970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204417.61974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204417.61977: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204417.61979: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.61981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.61983: Set connection var ansible_connection to ssh 34052 1727204417.61986: Set connection var ansible_timeout to 10 34052 1727204417.61988: Set connection var ansible_pipelining to False 34052 1727204417.61990: Set connection var ansible_shell_type to sh 34052 1727204417.61992: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204417.61994: Set connection var ansible_shell_executable to /bin/sh 34052 1727204417.62021: variable 'ansible_shell_executable' from source: unknown 34052 1727204417.62032: variable 'ansible_connection' from source: unknown 34052 1727204417.62039: variable 'ansible_module_compression' from source: unknown 34052 1727204417.62046: variable 'ansible_shell_type' from source: unknown 34052 1727204417.62052: variable 'ansible_shell_executable' from source: unknown 34052 1727204417.62060: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.62072: variable 'ansible_pipelining' from source: unknown 34052 1727204417.62079: variable 'ansible_timeout' from source: unknown 34052 1727204417.62088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.62259: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204417.62281: variable 'omit' from source: magic vars 34052 1727204417.62291: starting attempt loop 34052 1727204417.62297: running the handler 34052 1727204417.62314: handler run complete 34052 1727204417.62336: attempt loop complete, returning result 34052 1727204417.62343: _execute() done 34052 1727204417.62349: 
dumping result to json 34052 1727204417.62356: done dumping result, returning 34052 1727204417.62370: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [127b8e07-fff9-66a4-e2a3-000000000007] 34052 1727204417.62379: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000007 34052 1727204417.62571: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000007 34052 1727204417.62575: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 34052 1727204417.62641: no more pending results, returning what we have 34052 1727204417.62644: results queue empty 34052 1727204417.62644: checking for any_errors_fatal 34052 1727204417.62653: done checking for any_errors_fatal 34052 1727204417.62654: checking for max_fail_percentage 34052 1727204417.62656: done checking for max_fail_percentage 34052 1727204417.62657: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.62658: done checking to see if all hosts have failed 34052 1727204417.62658: getting the remaining hosts for this loop 34052 1727204417.62660: done getting the remaining hosts for this loop 34052 1727204417.62665: getting the next task for host managed-node1 34052 1727204417.62674: done getting next task for host managed-node1 34052 1727204417.62677: ^ task is: TASK: meta (flush_handlers) 34052 1727204417.62679: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204417.62683: getting variables 34052 1727204417.62685: in VariableManager get_vars() 34052 1727204417.62719: Calling all_inventory to load vars for managed-node1 34052 1727204417.62722: Calling groups_inventory to load vars for managed-node1 34052 1727204417.62728: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.62742: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.62746: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.62749: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.63270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.63819: done with get_vars() 34052 1727204417.63836: done getting variables 34052 1727204417.63913: in VariableManager get_vars() 34052 1727204417.63927: Calling all_inventory to load vars for managed-node1 34052 1727204417.63930: Calling groups_inventory to load vars for managed-node1 34052 1727204417.63933: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.63938: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.63941: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.63944: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.64354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.64761: done with get_vars() 34052 1727204417.64911: done queuing things up, now waiting for results queue to drain 34052 1727204417.64914: results queue empty 34052 1727204417.64915: checking for any_errors_fatal 34052 1727204417.64918: done checking for any_errors_fatal 34052 1727204417.64919: checking for 
max_fail_percentage 34052 1727204417.64920: done checking for max_fail_percentage 34052 1727204417.64921: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.64922: done checking to see if all hosts have failed 34052 1727204417.64922: getting the remaining hosts for this loop 34052 1727204417.64923: done getting the remaining hosts for this loop 34052 1727204417.64929: getting the next task for host managed-node1 34052 1727204417.64951: done getting next task for host managed-node1 34052 1727204417.64953: ^ task is: TASK: meta (flush_handlers) 34052 1727204417.64960: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204417.64972: getting variables 34052 1727204417.64973: in VariableManager get_vars() 34052 1727204417.64989: Calling all_inventory to load vars for managed-node1 34052 1727204417.64991: Calling groups_inventory to load vars for managed-node1 34052 1727204417.64994: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.65000: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.65003: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.65006: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.65193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.65407: done with get_vars() 34052 1727204417.65417: done getting variables 34052 1727204417.65473: in VariableManager get_vars() 34052 1727204417.65484: Calling all_inventory to load vars for managed-node1 34052 1727204417.65486: Calling groups_inventory to load vars for managed-node1 34052 1727204417.65489: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.65494: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.65496: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.65499: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.65660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.65944: done with get_vars() 34052 1727204417.65960: done queuing things up, now waiting for results queue to drain 34052 1727204417.65963: results queue empty 34052 1727204417.65963: checking for any_errors_fatal 34052 1727204417.65967: done checking for any_errors_fatal 34052 1727204417.65968: checking for max_fail_percentage 34052 1727204417.65969: done checking for max_fail_percentage 34052 1727204417.65970: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.65971: done checking to see if all hosts have failed 34052 1727204417.65972: getting the remaining hosts for this loop 34052 1727204417.65973: done getting the remaining hosts for this loop 34052 1727204417.65976: getting the next task for host managed-node1 34052 1727204417.65980: done getting next task for host managed-node1 34052 1727204417.65981: ^ task is: None 34052 1727204417.65983: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 34052 1727204417.65985: done queuing things up, now waiting for results queue to drain 34052 1727204417.65986: results queue empty 34052 1727204417.65986: checking for any_errors_fatal 34052 1727204417.65987: done checking for any_errors_fatal 34052 1727204417.65988: checking for max_fail_percentage 34052 1727204417.65989: done checking for max_fail_percentage 34052 1727204417.65990: checking to see if all hosts have failed and the running result is not ok 34052 1727204417.65991: done checking to see if all hosts have failed 34052 1727204417.65993: getting the next task for host managed-node1 34052 1727204417.65995: done getting next task for host managed-node1 34052 1727204417.65996: ^ task is: None 34052 1727204417.65998: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204417.66075: in VariableManager get_vars() 34052 1727204417.66105: done with get_vars() 34052 1727204417.66117: in VariableManager get_vars() 34052 1727204417.66135: done with get_vars() 34052 1727204417.66142: variable 'omit' from source: magic vars 34052 1727204417.66184: in VariableManager get_vars() 34052 1727204417.66201: done with get_vars() 34052 1727204417.66232: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 34052 1727204417.66710: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34052 1727204417.66742: getting the remaining hosts for this loop 34052 1727204417.66744: done getting the remaining hosts for this loop 34052 1727204417.66747: getting the next task for host managed-node1 34052 1727204417.66751: done getting next task for host managed-node1 34052 1727204417.66753: ^ task is: TASK: Gathering Facts 34052 1727204417.66755: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204417.66757: getting variables 34052 1727204417.66758: in VariableManager get_vars() 34052 1727204417.66777: Calling all_inventory to load vars for managed-node1 34052 1727204417.66779: Calling groups_inventory to load vars for managed-node1 34052 1727204417.66781: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204417.66788: Calling all_plugins_play to load vars for managed-node1 34052 1727204417.66805: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204417.66809: Calling groups_plugins_play to load vars for managed-node1 34052 1727204417.67004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204417.67377: done with get_vars() 34052 1727204417.67389: done getting variables 34052 1727204417.67442: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Tuesday 24 September 2024 15:00:17 -0400 (0:00:00.076) 0:00:03.991 ***** 34052 1727204417.67469: entering _queue_task() for managed-node1/gather_facts 34052 1727204417.67801: worker is 1 (out of 1 available) 34052 1727204417.67813: exiting _queue_task() for managed-node1/gather_facts 34052 1727204417.67829: done queuing things up, now waiting for results queue to drain 34052 1727204417.67831: waiting for pending results... 
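The trace that follows walks through Ansible's remote execution pipeline for the fact-gathering task: it probes the remote home directory, creates a per-task temporary directory, uploads the AnsiballZ-wrapped setup module over the SFTP subsystem, marks it executable, runs it with the remote Python interpreter, collects the JSON result, and finally removes the temporary directory. As a hedged illustration only (this is not the ansible-core implementation), the sketch below reproduces that probe/create/upload/chmod/run/cleanup pattern with plain ssh and scp subprocess calls; the target host, payload name, and interpreter path are assumptions taken from the log for the sake of the example.

```python
#!/usr/bin/env python3
"""Illustrative sketch (not ansible-core): the tmpdir/upload/chmod/run/cleanup
pattern visible in the trace below. Host and payload path are assumptions."""
import subprocess
import time

HOST = "root@10.31.8.176"        # assumed target, as seen in the trace
PAYLOAD = "AnsiballZ_setup.py"   # local module payload to run remotely


def ssh(command: str) -> str:
    """Run a shell command on the target, mirroring _low_level_execute_command()."""
    result = subprocess.run(
        ["ssh", HOST, "/bin/sh", "-c", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


# 1. Discover the remote home directory ("echo ~" in the trace).
home = ssh("echo ~ && sleep 0")

# 2. Create a unique, private temporary directory for this task.
tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
ssh(f"umask 77 && mkdir -p '{tmpdir}' && echo '{tmpdir}'")

# 3. Upload the module payload (Ansible itself uses the sftp subsystem here).
subprocess.run(["scp", PAYLOAD, f"{HOST}:{tmpdir}/{PAYLOAD}"], check=True)

# 4. Make the directory and payload executable, then run it with the remote Python.
ssh(f"chmod u+x '{tmpdir}/' '{tmpdir}/{PAYLOAD}'")
module_json = ssh(f"/usr/bin/python3.12 '{tmpdir}/{PAYLOAD}'")

# 5. Remove the temporary directory, as the last command in the trace does.
ssh(f"rm -f -r '{tmpdir}/' > /dev/null 2>&1")

print(module_json)
```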
34052 1727204417.68283: running TaskExecutor() for managed-node1/TASK: Gathering Facts 34052 1727204417.68288: in run() - task 127b8e07-fff9-66a4-e2a3-000000000115 34052 1727204417.68292: variable 'ansible_search_path' from source: unknown 34052 1727204417.68294: calling self._execute() 34052 1727204417.68369: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.68382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.68396: variable 'omit' from source: magic vars 34052 1727204417.68929: variable 'ansible_distribution_major_version' from source: facts 34052 1727204417.68953: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204417.68966: variable 'omit' from source: magic vars 34052 1727204417.68999: variable 'omit' from source: magic vars 34052 1727204417.69045: variable 'omit' from source: magic vars 34052 1727204417.69098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204417.69149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204417.69184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204417.69208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204417.69228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204417.69262: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204417.69275: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.69283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.69401: Set connection var ansible_connection to ssh 34052 1727204417.69491: Set connection var ansible_timeout to 10 34052 1727204417.69495: Set connection var ansible_pipelining to False 34052 1727204417.69498: Set connection var ansible_shell_type to sh 34052 1727204417.69501: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204417.69503: Set connection var ansible_shell_executable to /bin/sh 34052 1727204417.69505: variable 'ansible_shell_executable' from source: unknown 34052 1727204417.69508: variable 'ansible_connection' from source: unknown 34052 1727204417.69510: variable 'ansible_module_compression' from source: unknown 34052 1727204417.69513: variable 'ansible_shell_type' from source: unknown 34052 1727204417.69515: variable 'ansible_shell_executable' from source: unknown 34052 1727204417.69520: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204417.69533: variable 'ansible_pipelining' from source: unknown 34052 1727204417.69540: variable 'ansible_timeout' from source: unknown 34052 1727204417.69547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204417.69761: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204417.69781: variable 'omit' from source: magic vars 34052 1727204417.69791: starting attempt loop 34052 1727204417.69797: running the 
handler 34052 1727204417.69821: variable 'ansible_facts' from source: unknown 34052 1727204417.69848: _low_level_execute_command(): starting 34052 1727204417.69871: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204417.70780: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204417.70817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204417.70830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204417.70929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204417.73522: stdout chunk (state=3): >>>/root <<< 34052 1727204417.73845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204417.73884: stderr chunk (state=3): >>><<< 34052 1727204417.73888: stdout chunk (state=3): >>><<< 34052 1727204417.73965: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204417.73973: _low_level_execute_command(): starting 34052 1727204417.73977: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641 `" && echo ansible-tmp-1727204417.7391229-34300-238323255959641="` echo /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641 `" ) && sleep 0' 34052 
1727204417.75505: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204417.75543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204417.75563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204417.75670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204417.78488: stdout chunk (state=3): >>>ansible-tmp-1727204417.7391229-34300-238323255959641=/root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641 <<< 34052 1727204417.78767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204417.78771: stdout chunk (state=3): >>><<< 34052 1727204417.78774: stderr chunk (state=3): >>><<< 34052 1727204417.78793: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204417.7391229-34300-238323255959641=/root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204417.78832: variable 'ansible_module_compression' from source: unknown 34052 1727204417.78909: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34052 1727204417.78991: variable 'ansible_facts' from source: unknown 34052 1727204417.79214: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/AnsiballZ_setup.py 34052 
1727204417.79522: Sending initial data 34052 1727204417.79525: Sent initial data (154 bytes) 34052 1727204417.80177: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204417.80202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204417.80297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204417.82818: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204417.82879: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204417.82943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpanyu3jj1 /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/AnsiballZ_setup.py <<< 34052 1727204417.82949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/AnsiballZ_setup.py" <<< 34052 1727204417.83011: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpanyu3jj1" to remote "/root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/AnsiballZ_setup.py" <<< 34052 1727204417.84812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204417.84879: stderr chunk (state=3): >>><<< 34052 1727204417.84895: stdout chunk (state=3): >>><<< 34052 1727204417.84941: done transferring module to remote 34052 1727204417.84959: _low_level_execute_command(): starting 34052 1727204417.85022: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/ /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/AnsiballZ_setup.py && sleep 0' 34052 1727204417.86619: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204417.86623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204417.86692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204417.86797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204417.89734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204417.89755: stderr chunk (state=3): >>><<< 34052 1727204417.89768: stdout chunk (state=3): >>><<< 34052 1727204417.89832: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204417.89878: _low_level_execute_command(): starting 34052 1727204417.89890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/AnsiballZ_setup.py && sleep 0' 34052 1727204417.91322: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204417.91345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204417.91361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204417.91385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204417.91489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204417.91609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204417.91677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204418.83065: stdout chunk (state=3): >>> <<< 34052 1727204418.83074: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26514e308c194cfcd8a9c892de18dd", "ansible_user_id": "root", "ansible_user_uid": 
0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3018, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 698, "free": 3018}, "nocache": {"free": 3457, "used": 259}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_bo<<< 34052 1727204418.83098: stdout chunk (state=3): >>>ard_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_uuid": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 723, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "<<< 34052 1727204418.83335: stdout chunk (state=3): >>>size_available": 251314790400, "block_size": 4096, "block_total": 64479564, "block_available": 61356150, "block_used": 3123414, "inode_total": 16384000, "inode_available": 16301494, "inode_used": 82506, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDLk70qUDdIlMGefmY9CSzrAInUx7bdf89EgNGwy+627RdK/JZ6JwRBpph6RT/Xj1n4IlrnVUjUiUaorYlNqj24r7gfLUKrzB4vu8pIjwI6ge8+qjGGZDnQm+SKJK65ECm944hk7VFOi1xZWQJNYN9xVACr/ifxYeQOLNjmwajWGL4iKhiO4shsjmafF13uWUiv8C8TB9VoiAf+UJPc5DUojGJ0pjF2P/VkLEYMGRslXiQJ+GH1QxrlNZZrQY5v5Xfsd7i7l5F01JvvOvVJHkZOt/vBCvIhn7TxIdIa+95vg9XsSUTY9S0avSZv95Ua/hGHIxgLE5CNJIQUdwfJnNi0gPblQGjNj3TVx+VqgLzOjFTfD8EIkJFmC/DMhm0bCDgdclIMmmhdkJDQ6ApjJcbRElBMa+IwZZd+l+qfD/DWcsigb7wftf43WI+Y74+SRpYtLmq0h3XeubKMqvxdqOIm05stM4OxvJgopHVPTepTczripmjJ0lbfD8TkdY3NYw8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFS01F5Tk75zUFCO6hP1eZVzOfFBOUa1U6ePV4u7EOwcevlrKoP/8LVaMLToSYNDptDQpZQIlpx02mv3wOPx14c=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIEbqXmW7LS2UP4fmMBI/TP3Wh1Hqq5KAj8b9n0HP0o8r", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "18", "epoch": "1727204418", "epoch_int": "1727204418", "date": "2024-09-24", "time": "15:00:18", "iso8601_micro": "2024-09-24T19:00:18.765579Z", "iso8601": "2024-09-24T19:00:18Z", "iso8601_basic": "20240924T150018765579", "iso8601_basic_short": "20240924T150018", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 37226 10.31.8.176 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 37226 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5361328125, "5m": 0.48388671875, "15m": 0.2998046875}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", 
"ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10d0:dfff:fe0f:c94d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.8.176"], "ansible_all_ipv6_addresses": ["fe80::10d0:dfff:fe0f:c94d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.8.176", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10d0:dfff:fe0f:c94d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34052 1727204418.86301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204418.86315: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204418.86368: stderr chunk (state=3): >>><<< 34052 1727204418.86377: stdout chunk (state=3): >>><<< 34052 1727204418.86423: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26514e308c194cfcd8a9c892de18dd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3018, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 698, "free": 3018}, "nocache": {"free": 3457, "used": 259}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_uuid": "ec26514e-308c-194c-fcd8-a9c892de18dd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": 
{"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 723, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251314790400, "block_size": 4096, "block_total": 64479564, "block_available": 61356150, "block_used": 3123414, "inode_total": 16384000, "inode_available": 16301494, "inode_used": 82506, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDLk70qUDdIlMGefmY9CSzrAInUx7bdf89EgNGwy+627RdK/JZ6JwRBpph6RT/Xj1n4IlrnVUjUiUaorYlNqj24r7gfLUKrzB4vu8pIjwI6ge8+qjGGZDnQm+SKJK65ECm944hk7VFOi1xZWQJNYN9xVACr/ifxYeQOLNjmwajWGL4iKhiO4shsjmafF13uWUiv8C8TB9VoiAf+UJPc5DUojGJ0pjF2P/VkLEYMGRslXiQJ+GH1QxrlNZZrQY5v5Xfsd7i7l5F01JvvOvVJHkZOt/vBCvIhn7TxIdIa+95vg9XsSUTY9S0avSZv95Ua/hGHIxgLE5CNJIQUdwfJnNi0gPblQGjNj3TVx+VqgLzOjFTfD8EIkJFmC/DMhm0bCDgdclIMmmhdkJDQ6ApjJcbRElBMa+IwZZd+l+qfD/DWcsigb7wftf43WI+Y74+SRpYtLmq0h3XeubKMqvxdqOIm05stM4OxvJgopHVPTepTczripmjJ0lbfD8TkdY3NYw8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFS01F5Tk75zUFCO6hP1eZVzOfFBOUa1U6ePV4u7EOwcevlrKoP/8LVaMLToSYNDptDQpZQIlpx02mv3wOPx14c=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIEbqXmW7LS2UP4fmMBI/TP3Wh1Hqq5KAj8b9n0HP0o8r", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "18", "epoch": "1727204418", "epoch_int": "1727204418", "date": "2024-09-24", "time": "15:00:18", "iso8601_micro": "2024-09-24T19:00:18.765579Z", "iso8601": "2024-09-24T19:00:18Z", "iso8601_basic": "20240924T150018765579", "iso8601_basic_short": "20240924T150018", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 37226 10.31.8.176 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 37226 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": 
{"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5361328125, "5m": 0.48388671875, "15m": 0.2998046875}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10d0:dfff:fe0f:c94d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.8.176", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d0:df:0f:c9:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.8.176"], "ansible_all_ipv6_addresses": ["fe80::10d0:dfff:fe0f:c94d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.8.176", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10d0:dfff:fe0f:c94d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204418.86667: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204418.86687: _low_level_execute_command(): starting 34052 1727204418.86691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204417.7391229-34300-238323255959641/ > /dev/null 2>&1 && sleep 0' 34052 1727204418.87573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204418.87580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204418.87584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204418.87586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204418.87641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204418.90481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204418.90485: stdout chunk (state=3): >>><<< 34052 1727204418.90487: stderr chunk (state=3): >>><<< 34052 1727204418.90490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204418.90518: handler run complete 34052 1727204418.90717: variable 'ansible_facts' from source: unknown 34052 1727204418.90889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.91377: variable 'ansible_facts' from source: unknown 34052 1727204418.91496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.91662: attempt loop complete, returning result 34052 1727204418.91677: _execute() done 34052 1727204418.91689: dumping result to json 34052 1727204418.91719: done dumping result, returning 34052 1727204418.91729: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [127b8e07-fff9-66a4-e2a3-000000000115] 34052 1727204418.91732: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000115 ok: [managed-node1] 34052 1727204418.92313: no more pending results, returning what we have 34052 1727204418.92316: results queue empty 34052 1727204418.92316: checking for any_errors_fatal 34052 1727204418.92317: done checking for any_errors_fatal 34052 1727204418.92318: checking for max_fail_percentage 34052 1727204418.92318: done checking for max_fail_percentage 34052 1727204418.92319: checking to see if all hosts have failed and the running result is not ok 34052 1727204418.92319: done checking to see if all hosts have failed 34052 1727204418.92320: getting the remaining hosts for this loop 34052 1727204418.92321: done getting the remaining hosts for this loop 34052 1727204418.92323: getting the next task for host managed-node1 34052 1727204418.92328: done getting next task for host managed-node1 34052 
1727204418.92329: ^ task is: TASK: meta (flush_handlers) 34052 1727204418.92330: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204418.92333: getting variables 34052 1727204418.92334: in VariableManager get_vars() 34052 1727204418.92357: Calling all_inventory to load vars for managed-node1 34052 1727204418.92359: Calling groups_inventory to load vars for managed-node1 34052 1727204418.92361: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204418.92372: Calling all_plugins_play to load vars for managed-node1 34052 1727204418.92373: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204418.92377: Calling groups_plugins_play to load vars for managed-node1 34052 1727204418.92490: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000115 34052 1727204418.92494: WORKER PROCESS EXITING 34052 1727204418.92507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.92644: done with get_vars() 34052 1727204418.92653: done getting variables 34052 1727204418.92706: in VariableManager get_vars() 34052 1727204418.92716: Calling all_inventory to load vars for managed-node1 34052 1727204418.92718: Calling groups_inventory to load vars for managed-node1 34052 1727204418.92720: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204418.92725: Calling all_plugins_play to load vars for managed-node1 34052 1727204418.92727: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204418.92731: Calling groups_plugins_play to load vars for managed-node1 34052 1727204418.92827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.92979: done with get_vars() 34052 1727204418.92990: done queuing things up, now waiting for results queue to drain 34052 1727204418.92991: results queue empty 34052 1727204418.92991: checking for any_errors_fatal 34052 1727204418.92994: done checking for any_errors_fatal 34052 1727204418.92994: checking for max_fail_percentage 34052 1727204418.92995: done checking for max_fail_percentage 34052 1727204418.92996: checking to see if all hosts have failed and the running result is not ok 34052 1727204418.92996: done checking to see if all hosts have failed 34052 1727204418.93000: getting the remaining hosts for this loop 34052 1727204418.93001: done getting the remaining hosts for this loop 34052 1727204418.93003: getting the next task for host managed-node1 34052 1727204418.93006: done getting next task for host managed-node1 34052 1727204418.93007: ^ task is: TASK: Include the task 'show_interfaces.yml' 34052 1727204418.93009: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204418.93010: getting variables 34052 1727204418.93011: in VariableManager get_vars() 34052 1727204418.93020: Calling all_inventory to load vars for managed-node1 34052 1727204418.93021: Calling groups_inventory to load vars for managed-node1 34052 1727204418.93022: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204418.93026: Calling all_plugins_play to load vars for managed-node1 34052 1727204418.93028: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204418.93030: Calling groups_plugins_play to load vars for managed-node1 34052 1727204418.93126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.93260: done with get_vars() 34052 1727204418.93269: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Tuesday 24 September 2024 15:00:18 -0400 (0:00:01.258) 0:00:05.250 ***** 34052 1727204418.93326: entering _queue_task() for managed-node1/include_tasks 34052 1727204418.93645: worker is 1 (out of 1 available) 34052 1727204418.93659: exiting _queue_task() for managed-node1/include_tasks 34052 1727204418.93675: done queuing things up, now waiting for results queue to drain 34052 1727204418.93676: waiting for pending results... 34052 1727204418.93885: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 34052 1727204418.94075: in run() - task 127b8e07-fff9-66a4-e2a3-00000000000b 34052 1727204418.94079: variable 'ansible_search_path' from source: unknown 34052 1727204418.94082: calling self._execute() 34052 1727204418.94135: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204418.94153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204418.94183: variable 'omit' from source: magic vars 34052 1727204418.94606: variable 'ansible_distribution_major_version' from source: facts 34052 1727204418.94629: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204418.94640: _execute() done 34052 1727204418.94648: dumping result to json 34052 1727204418.94656: done dumping result, returning 34052 1727204418.94668: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-66a4-e2a3-00000000000b] 34052 1727204418.94684: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000b 34052 1727204418.94815: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000b 34052 1727204418.94824: WORKER PROCESS EXITING 34052 1727204418.94860: no more pending results, returning what we have 34052 1727204418.94985: in VariableManager get_vars() 34052 1727204418.95038: Calling all_inventory to load vars for managed-node1 34052 1727204418.95041: Calling groups_inventory to load vars for managed-node1 34052 1727204418.95043: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204418.95058: Calling all_plugins_play to load vars for managed-node1 34052 1727204418.95061: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204418.95064: Calling groups_plugins_play to load vars for managed-node1 34052 1727204418.95333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.95530: done with get_vars() 34052 1727204418.95538: variable 
'ansible_search_path' from source: unknown 34052 1727204418.95554: we have included files to process 34052 1727204418.95555: generating all_blocks data 34052 1727204418.95556: done generating all_blocks data 34052 1727204418.95556: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204418.95557: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204418.95559: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204418.95706: in VariableManager get_vars() 34052 1727204418.95722: done with get_vars() 34052 1727204418.95838: done processing included file 34052 1727204418.95840: iterating over new_blocks loaded from include file 34052 1727204418.95842: in VariableManager get_vars() 34052 1727204418.95858: done with get_vars() 34052 1727204418.95860: filtering new block on tags 34052 1727204418.95879: done filtering new block on tags 34052 1727204418.95882: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 34052 1727204418.95887: extending task lists for all hosts with included blocks 34052 1727204418.95957: done extending task lists 34052 1727204418.95958: done processing included files 34052 1727204418.95959: results queue empty 34052 1727204418.95960: checking for any_errors_fatal 34052 1727204418.95962: done checking for any_errors_fatal 34052 1727204418.95962: checking for max_fail_percentage 34052 1727204418.95963: done checking for max_fail_percentage 34052 1727204418.95964: checking to see if all hosts have failed and the running result is not ok 34052 1727204418.95967: done checking to see if all hosts have failed 34052 1727204418.95968: getting the remaining hosts for this loop 34052 1727204418.95969: done getting the remaining hosts for this loop 34052 1727204418.95971: getting the next task for host managed-node1 34052 1727204418.95975: done getting next task for host managed-node1 34052 1727204418.95977: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34052 1727204418.95980: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204418.95982: getting variables 34052 1727204418.95983: in VariableManager get_vars() 34052 1727204418.95996: Calling all_inventory to load vars for managed-node1 34052 1727204418.95998: Calling groups_inventory to load vars for managed-node1 34052 1727204418.96000: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204418.96006: Calling all_plugins_play to load vars for managed-node1 34052 1727204418.96009: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204418.96012: Calling groups_plugins_play to load vars for managed-node1 34052 1727204418.96248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.96548: done with get_vars() 34052 1727204418.96560: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:18 -0400 (0:00:00.033) 0:00:05.284 ***** 34052 1727204418.96701: entering _queue_task() for managed-node1/include_tasks 34052 1727204418.97000: worker is 1 (out of 1 available) 34052 1727204418.97015: exiting _queue_task() for managed-node1/include_tasks 34052 1727204418.97028: done queuing things up, now waiting for results queue to drain 34052 1727204418.97030: waiting for pending results... 34052 1727204418.97217: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 34052 1727204418.97292: in run() - task 127b8e07-fff9-66a4-e2a3-00000000012b 34052 1727204418.97305: variable 'ansible_search_path' from source: unknown 34052 1727204418.97308: variable 'ansible_search_path' from source: unknown 34052 1727204418.97342: calling self._execute() 34052 1727204418.97419: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204418.97423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204418.97433: variable 'omit' from source: magic vars 34052 1727204418.97743: variable 'ansible_distribution_major_version' from source: facts 34052 1727204418.97753: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204418.97759: _execute() done 34052 1727204418.97762: dumping result to json 34052 1727204418.97767: done dumping result, returning 34052 1727204418.97775: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-66a4-e2a3-00000000012b] 34052 1727204418.97778: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000012b 34052 1727204418.97880: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000012b 34052 1727204418.97883: WORKER PROCESS EXITING 34052 1727204418.97914: no more pending results, returning what we have 34052 1727204418.97920: in VariableManager get_vars() 34052 1727204418.97974: Calling all_inventory to load vars for managed-node1 34052 1727204418.97977: Calling groups_inventory to load vars for managed-node1 34052 1727204418.97979: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204418.97995: Calling all_plugins_play to load vars for managed-node1 34052 1727204418.97998: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204418.98001: Calling groups_plugins_play to load vars for managed-node1 34052 1727204418.98171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 34052 1727204418.98321: done with get_vars() 34052 1727204418.98330: variable 'ansible_search_path' from source: unknown 34052 1727204418.98331: variable 'ansible_search_path' from source: unknown 34052 1727204418.98361: we have included files to process 34052 1727204418.98362: generating all_blocks data 34052 1727204418.98363: done generating all_blocks data 34052 1727204418.98364: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204418.98364: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204418.98369: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204418.98611: done processing included file 34052 1727204418.98613: iterating over new_blocks loaded from include file 34052 1727204418.98615: in VariableManager get_vars() 34052 1727204418.98633: done with get_vars() 34052 1727204418.98634: filtering new block on tags 34052 1727204418.98647: done filtering new block on tags 34052 1727204418.98649: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 34052 1727204418.98653: extending task lists for all hosts with included blocks 34052 1727204418.98720: done extending task lists 34052 1727204418.98721: done processing included files 34052 1727204418.98722: results queue empty 34052 1727204418.98723: checking for any_errors_fatal 34052 1727204418.98727: done checking for any_errors_fatal 34052 1727204418.98727: checking for max_fail_percentage 34052 1727204418.98729: done checking for max_fail_percentage 34052 1727204418.98729: checking to see if all hosts have failed and the running result is not ok 34052 1727204418.98730: done checking to see if all hosts have failed 34052 1727204418.98731: getting the remaining hosts for this loop 34052 1727204418.98732: done getting the remaining hosts for this loop 34052 1727204418.98734: getting the next task for host managed-node1 34052 1727204418.98737: done getting next task for host managed-node1 34052 1727204418.98739: ^ task is: TASK: Gather current interface info 34052 1727204418.98741: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204418.98743: getting variables 34052 1727204418.98743: in VariableManager get_vars() 34052 1727204418.98753: Calling all_inventory to load vars for managed-node1 34052 1727204418.98755: Calling groups_inventory to load vars for managed-node1 34052 1727204418.98756: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204418.98761: Calling all_plugins_play to load vars for managed-node1 34052 1727204418.98762: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204418.98771: Calling groups_plugins_play to load vars for managed-node1 34052 1727204418.98972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204418.99193: done with get_vars() 34052 1727204418.99206: done getting variables 34052 1727204418.99260: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:18 -0400 (0:00:00.025) 0:00:05.310 ***** 34052 1727204418.99294: entering _queue_task() for managed-node1/command 34052 1727204418.99650: worker is 1 (out of 1 available) 34052 1727204418.99668: exiting _queue_task() for managed-node1/command 34052 1727204418.99682: done queuing things up, now waiting for results queue to drain 34052 1727204418.99684: waiting for pending results... 
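For reference, the 'Gather current interface info' task queued above presumably looks roughly like the sketch below. It is reconstructed from the module_args recorded later in this run (chdir=/sys/class/net, _raw_params='ls -1'); the register name _current_interfaces is inferred from the variable lookup in the following set_fact task, and the actual contents of get_current_interfaces.yml may differ.

    # Sketch only; reconstructed from the logged invocation, not copied from the file
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces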
34052 1727204419.00000: running TaskExecutor() for managed-node1/TASK: Gather current interface info 34052 1727204419.00092: in run() - task 127b8e07-fff9-66a4-e2a3-00000000013a 34052 1727204419.00104: variable 'ansible_search_path' from source: unknown 34052 1727204419.00108: variable 'ansible_search_path' from source: unknown 34052 1727204419.00143: calling self._execute() 34052 1727204419.00216: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.00221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.00233: variable 'omit' from source: magic vars 34052 1727204419.00564: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.00576: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.00582: variable 'omit' from source: magic vars 34052 1727204419.00620: variable 'omit' from source: magic vars 34052 1727204419.00649: variable 'omit' from source: magic vars 34052 1727204419.00686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204419.00722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204419.00742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204419.00756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204419.00768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204419.00793: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204419.00796: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.00799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.00880: Set connection var ansible_connection to ssh 34052 1727204419.00888: Set connection var ansible_timeout to 10 34052 1727204419.00894: Set connection var ansible_pipelining to False 34052 1727204419.00897: Set connection var ansible_shell_type to sh 34052 1727204419.00904: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204419.00913: Set connection var ansible_shell_executable to /bin/sh 34052 1727204419.00937: variable 'ansible_shell_executable' from source: unknown 34052 1727204419.00940: variable 'ansible_connection' from source: unknown 34052 1727204419.00944: variable 'ansible_module_compression' from source: unknown 34052 1727204419.00946: variable 'ansible_shell_type' from source: unknown 34052 1727204419.00949: variable 'ansible_shell_executable' from source: unknown 34052 1727204419.00951: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.00954: variable 'ansible_pipelining' from source: unknown 34052 1727204419.00956: variable 'ansible_timeout' from source: unknown 34052 1727204419.00962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.01080: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204419.01089: variable 'omit' from source: magic vars 34052 
1727204419.01095: starting attempt loop 34052 1727204419.01098: running the handler 34052 1727204419.01113: _low_level_execute_command(): starting 34052 1727204419.01121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204419.01701: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204419.01708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204419.01712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204419.01769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204419.01772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204419.01779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204419.01852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204419.04358: stdout chunk (state=3): >>>/root <<< 34052 1727204419.04546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204419.04611: stderr chunk (state=3): >>><<< 34052 1727204419.04615: stdout chunk (state=3): >>><<< 34052 1727204419.04639: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204419.04656: _low_level_execute_command(): starting 34052 1727204419.04662: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235 
`" && echo ansible-tmp-1727204419.0464053-34352-10732053410235="` echo /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235 `" ) && sleep 0' 34052 1727204419.05182: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204419.05186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204419.05189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204419.05220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204419.05246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204419.05249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204419.05270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204419.05328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204419.08178: stdout chunk (state=3): >>>ansible-tmp-1727204419.0464053-34352-10732053410235=/root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235 <<< 34052 1727204419.08357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204419.08423: stderr chunk (state=3): >>><<< 34052 1727204419.08427: stdout chunk (state=3): >>><<< 34052 1727204419.08447: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204419.0464053-34352-10732053410235=/root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204419.08480: variable 'ansible_module_compression' from source: 
unknown 34052 1727204419.08533: ANSIBALLZ: Using generic lock for ansible.legacy.command 34052 1727204419.08539: ANSIBALLZ: Acquiring lock 34052 1727204419.08542: ANSIBALLZ: Lock acquired: 140141530567488 34052 1727204419.08544: ANSIBALLZ: Creating module 34052 1727204419.20174: ANSIBALLZ: Writing module into payload 34052 1727204419.20223: ANSIBALLZ: Writing module 34052 1727204419.20259: ANSIBALLZ: Renaming module 34052 1727204419.20276: ANSIBALLZ: Done creating module 34052 1727204419.20301: variable 'ansible_facts' from source: unknown 34052 1727204419.20384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/AnsiballZ_command.py 34052 1727204419.20544: Sending initial data 34052 1727204419.20559: Sent initial data (155 bytes) 34052 1727204419.21069: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204419.21089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204419.21150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204419.21153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204419.21156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204419.21219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204419.23535: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204419.23583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204419.23635: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpz9fpoxlj /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/AnsiballZ_command.py <<< 34052 1727204419.23639: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/AnsiballZ_command.py" <<< 34052 1727204419.23706: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpz9fpoxlj" to remote "/root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/AnsiballZ_command.py" <<< 34052 1727204419.24606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204419.24657: stderr chunk (state=3): >>><<< 34052 1727204419.24661: stdout chunk (state=3): >>><<< 34052 1727204419.24663: done transferring module to remote 34052 1727204419.24684: _low_level_execute_command(): starting 34052 1727204419.24688: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/ /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/AnsiballZ_command.py && sleep 0' 34052 1727204419.25305: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204419.25312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204419.25350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204419.27979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204419.27983: stdout chunk (state=3): >>><<< 34052 1727204419.27986: stderr chunk (state=3): >>><<< 34052 1727204419.28102: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204419.28107: _low_level_execute_command(): starting 34052 1727204419.28109: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/AnsiballZ_command.py && sleep 0' 34052 1727204419.28669: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204419.28719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204419.28723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204419.28728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204419.28730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204419.28777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204419.28788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204419.28846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204419.56371: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:19.557476", "end": "2024-09-24 15:00:19.562411", "delta": "0:00:00.004935", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204419.58820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204419.58828: stdout chunk (state=3): >>><<< 34052 1727204419.58831: stderr chunk (state=3): >>><<< 34052 1727204419.58872: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:19.557476", "end": "2024-09-24 15:00:19.562411", "delta": "0:00:00.004935", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204419.58928: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204419.58932: _low_level_execute_command(): starting 34052 1727204419.59023: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204419.0464053-34352-10732053410235/ > /dev/null 2>&1 && sleep 0' 34052 1727204419.59700: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204419.59717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204419.59855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204419.59924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204419.59976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34052 1727204419.62799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204419.62830: stderr chunk (state=3): >>><<< 34052 1727204419.62906: stdout chunk (state=3): >>><<< 34052 1727204419.62928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 
10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34052 1727204419.62941: handler run complete 34052 1727204419.63029: Evaluated conditional (False): False 34052 1727204419.63032: attempt loop complete, returning result 34052 1727204419.63035: _execute() done 34052 1727204419.63279: dumping result to json 34052 1727204419.63282: done dumping result, returning 34052 1727204419.63284: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [127b8e07-fff9-66a4-e2a3-00000000013a] 34052 1727204419.63287: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000013a 34052 1727204419.63374: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000013a ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004935", "end": "2024-09-24 15:00:19.562411", "rc": 0, "start": "2024-09-24 15:00:19.557476" } STDOUT: bonding_masters eth0 lo 34052 1727204419.63460: no more pending results, returning what we have 34052 1727204419.63463: results queue empty 34052 1727204419.63464: checking for any_errors_fatal 34052 1727204419.63468: done checking for any_errors_fatal 34052 1727204419.63468: checking for max_fail_percentage 34052 1727204419.63470: done checking for max_fail_percentage 34052 1727204419.63470: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.63471: done checking to see if all hosts have failed 34052 1727204419.63472: getting the remaining hosts for this loop 34052 1727204419.63474: done getting the remaining hosts for this loop 34052 1727204419.63478: getting the next task for host managed-node1 34052 1727204419.63485: done getting next task for host managed-node1 34052 1727204419.63487: ^ task is: TASK: Set current_interfaces 34052 1727204419.63491: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204419.63497: getting variables 34052 1727204419.63498: in VariableManager get_vars() 34052 1727204419.63543: Calling all_inventory to load vars for managed-node1 34052 1727204419.63546: Calling groups_inventory to load vars for managed-node1 34052 1727204419.63549: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.63562: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.63775: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.63785: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.64434: WORKER PROCESS EXITING 34052 1727204419.64463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.65138: done with get_vars() 34052 1727204419.65152: done getting variables 34052 1727204419.65336: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.660) 0:00:05.970 ***** 34052 1727204419.65373: entering _queue_task() for managed-node1/set_fact 34052 1727204419.66207: worker is 1 (out of 1 available) 34052 1727204419.66220: exiting _queue_task() for managed-node1/set_fact 34052 1727204419.66233: done queuing things up, now waiting for results queue to drain 34052 1727204419.66235: waiting for pending results... 
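The 'Set current_interfaces' task queued above turns the registered command output into the current_interfaces list that appears in the result below. A plausible sketch; the stdout_lines expression is an assumption, since the log only shows the resulting fact value:

    # Sketch only; the exact Jinja2 expression is assumed, not taken from the log
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"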
34052 1727204419.66884: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 34052 1727204419.66891: in run() - task 127b8e07-fff9-66a4-e2a3-00000000013b 34052 1727204419.66894: variable 'ansible_search_path' from source: unknown 34052 1727204419.66896: variable 'ansible_search_path' from source: unknown 34052 1727204419.67272: calling self._execute() 34052 1727204419.67276: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.67278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.67281: variable 'omit' from source: magic vars 34052 1727204419.68016: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.68292: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.68472: variable 'omit' from source: magic vars 34052 1727204419.68477: variable 'omit' from source: magic vars 34052 1727204419.68521: variable '_current_interfaces' from source: set_fact 34052 1727204419.68972: variable 'omit' from source: magic vars 34052 1727204419.68975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204419.68979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204419.68982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204419.68996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204419.69015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204419.69058: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204419.69371: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.69375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.69400: Set connection var ansible_connection to ssh 34052 1727204419.69413: Set connection var ansible_timeout to 10 34052 1727204419.69427: Set connection var ansible_pipelining to False 34052 1727204419.69435: Set connection var ansible_shell_type to sh 34052 1727204419.69452: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204419.69464: Set connection var ansible_shell_executable to /bin/sh 34052 1727204419.69499: variable 'ansible_shell_executable' from source: unknown 34052 1727204419.69556: variable 'ansible_connection' from source: unknown 34052 1727204419.69567: variable 'ansible_module_compression' from source: unknown 34052 1727204419.69576: variable 'ansible_shell_type' from source: unknown 34052 1727204419.69584: variable 'ansible_shell_executable' from source: unknown 34052 1727204419.69592: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.69601: variable 'ansible_pipelining' from source: unknown 34052 1727204419.69610: variable 'ansible_timeout' from source: unknown 34052 1727204419.69667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.69900: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34052 1727204419.69918: variable 'omit' from source: magic vars 34052 1727204419.69929: starting attempt loop 34052 1727204419.69936: running the handler 34052 1727204419.69953: handler run complete 34052 1727204419.69970: attempt loop complete, returning result 34052 1727204419.69981: _execute() done 34052 1727204419.69995: dumping result to json 34052 1727204419.70004: done dumping result, returning 34052 1727204419.70017: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [127b8e07-fff9-66a4-e2a3-00000000013b] 34052 1727204419.70025: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000013b ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 34052 1727204419.70210: no more pending results, returning what we have 34052 1727204419.70214: results queue empty 34052 1727204419.70214: checking for any_errors_fatal 34052 1727204419.70224: done checking for any_errors_fatal 34052 1727204419.70227: checking for max_fail_percentage 34052 1727204419.70230: done checking for max_fail_percentage 34052 1727204419.70230: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.70231: done checking to see if all hosts have failed 34052 1727204419.70232: getting the remaining hosts for this loop 34052 1727204419.70233: done getting the remaining hosts for this loop 34052 1727204419.70237: getting the next task for host managed-node1 34052 1727204419.70246: done getting next task for host managed-node1 34052 1727204419.70249: ^ task is: TASK: Show current_interfaces 34052 1727204419.70252: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204419.70258: getting variables 34052 1727204419.70259: in VariableManager get_vars() 34052 1727204419.70305: Calling all_inventory to load vars for managed-node1 34052 1727204419.70308: Calling groups_inventory to load vars for managed-node1 34052 1727204419.70310: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.70478: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.70482: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.70486: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.70679: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000013b 34052 1727204419.70683: WORKER PROCESS EXITING 34052 1727204419.70710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.70963: done with get_vars() 34052 1727204419.70978: done getting variables 34052 1727204419.71087: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.057) 0:00:06.028 ***** 34052 1727204419.71120: entering _queue_task() for managed-node1/debug 34052 1727204419.71122: Creating lock for debug 34052 1727204419.71482: worker is 1 (out of 1 available) 34052 1727204419.71498: exiting _queue_task() for managed-node1/debug 34052 1727204419.71511: done queuing things up, now waiting for results queue to drain 34052 1727204419.71513: waiting for pending results... 
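The 'Show current_interfaces' task queued above is a plain debug of the fact set earlier. A sketch consistent with the MSG line printed in the result below; the exact msg template is assumed:

    # Sketch only; msg wording inferred from the MSG output in the task result
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"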
34052 1727204419.71827: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 34052 1727204419.71951: in run() - task 127b8e07-fff9-66a4-e2a3-00000000012c 34052 1727204419.71981: variable 'ansible_search_path' from source: unknown 34052 1727204419.71995: variable 'ansible_search_path' from source: unknown 34052 1727204419.72048: calling self._execute() 34052 1727204419.72157: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.72171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.72185: variable 'omit' from source: magic vars 34052 1727204419.72611: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.72791: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.72809: variable 'omit' from source: magic vars 34052 1727204419.72861: variable 'omit' from source: magic vars 34052 1727204419.73104: variable 'current_interfaces' from source: set_fact 34052 1727204419.73213: variable 'omit' from source: magic vars 34052 1727204419.73451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204419.73454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204419.73457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204419.73577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204419.73597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204419.73641: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204419.73651: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.73660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.73902: Set connection var ansible_connection to ssh 34052 1727204419.73945: Set connection var ansible_timeout to 10 34052 1727204419.74008: Set connection var ansible_pipelining to False 34052 1727204419.74039: Set connection var ansible_shell_type to sh 34052 1727204419.74043: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204419.74045: Set connection var ansible_shell_executable to /bin/sh 34052 1727204419.74092: variable 'ansible_shell_executable' from source: unknown 34052 1727204419.74133: variable 'ansible_connection' from source: unknown 34052 1727204419.74145: variable 'ansible_module_compression' from source: unknown 34052 1727204419.74159: variable 'ansible_shell_type' from source: unknown 34052 1727204419.74170: variable 'ansible_shell_executable' from source: unknown 34052 1727204419.74185: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.74194: variable 'ansible_pipelining' from source: unknown 34052 1727204419.74201: variable 'ansible_timeout' from source: unknown 34052 1727204419.74209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.74381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
34052 1727204419.74471: variable 'omit' from source: magic vars 34052 1727204419.74474: starting attempt loop 34052 1727204419.74478: running the handler 34052 1727204419.74481: handler run complete 34052 1727204419.74499: attempt loop complete, returning result 34052 1727204419.74597: _execute() done 34052 1727204419.74600: dumping result to json 34052 1727204419.74602: done dumping result, returning 34052 1727204419.74605: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [127b8e07-fff9-66a4-e2a3-00000000012c] 34052 1727204419.74607: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000012c 34052 1727204419.74692: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000012c 34052 1727204419.74696: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 34052 1727204419.74758: no more pending results, returning what we have 34052 1727204419.74762: results queue empty 34052 1727204419.74763: checking for any_errors_fatal 34052 1727204419.74771: done checking for any_errors_fatal 34052 1727204419.74772: checking for max_fail_percentage 34052 1727204419.74773: done checking for max_fail_percentage 34052 1727204419.74774: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.74775: done checking to see if all hosts have failed 34052 1727204419.74776: getting the remaining hosts for this loop 34052 1727204419.74778: done getting the remaining hosts for this loop 34052 1727204419.74783: getting the next task for host managed-node1 34052 1727204419.74792: done getting next task for host managed-node1 34052 1727204419.74795: ^ task is: TASK: Include the task 'manage_test_interface.yml' 34052 1727204419.74797: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204419.74803: getting variables 34052 1727204419.74805: in VariableManager get_vars() 34052 1727204419.74855: Calling all_inventory to load vars for managed-node1 34052 1727204419.74859: Calling groups_inventory to load vars for managed-node1 34052 1727204419.74862: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.74979: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.74983: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.74987: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.75444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.75602: done with get_vars() 34052 1727204419.75613: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.045) 0:00:06.074 ***** 34052 1727204419.75698: entering _queue_task() for managed-node1/include_tasks 34052 1727204419.75954: worker is 1 (out of 1 available) 34052 1727204419.75973: exiting _queue_task() for managed-node1/include_tasks 34052 1727204419.75987: done queuing things up, now waiting for results queue to drain 34052 1727204419.75989: waiting for pending results... 
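The ok result above is produced by the debug task at show_interfaces.yml:5. A minimal sketch of what that task plausibly looks like, reconstructed only from the task name and the MSG format in this trace (the real file may differ):

# Hypothetical reconstruction of the task at show_interfaces.yml:5.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"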
34052 1727204419.76258: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 34052 1727204419.76283: in run() - task 127b8e07-fff9-66a4-e2a3-00000000000c 34052 1727204419.76374: variable 'ansible_search_path' from source: unknown 34052 1727204419.76378: calling self._execute() 34052 1727204419.76452: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.76462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.76479: variable 'omit' from source: magic vars 34052 1727204419.77114: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.77137: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.77149: _execute() done 34052 1727204419.77158: dumping result to json 34052 1727204419.77204: done dumping result, returning 34052 1727204419.77210: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [127b8e07-fff9-66a4-e2a3-00000000000c] 34052 1727204419.77213: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000c 34052 1727204419.77548: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000c 34052 1727204419.77553: WORKER PROCESS EXITING 34052 1727204419.77587: no more pending results, returning what we have 34052 1727204419.77593: in VariableManager get_vars() 34052 1727204419.77651: Calling all_inventory to load vars for managed-node1 34052 1727204419.77655: Calling groups_inventory to load vars for managed-node1 34052 1727204419.77657: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.77676: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.77679: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.77683: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.78115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.78419: done with get_vars() 34052 1727204419.78429: variable 'ansible_search_path' from source: unknown 34052 1727204419.78446: we have included files to process 34052 1727204419.78447: generating all_blocks data 34052 1727204419.78449: done generating all_blocks data 34052 1727204419.78454: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34052 1727204419.78455: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34052 1727204419.78458: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34052 1727204419.79121: in VariableManager get_vars() 34052 1727204419.79211: done with get_vars() 34052 1727204419.79512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 34052 1727204419.80142: done processing included file 34052 1727204419.80145: iterating over new_blocks loaded from include file 34052 1727204419.80153: in VariableManager get_vars() 34052 1727204419.80178: done with get_vars() 34052 1727204419.80180: filtering new block on tags 34052 1727204419.80215: done filtering new block on tags 34052 1727204419.80218: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 34052 1727204419.80470: extending task lists for all hosts with included blocks 34052 1727204419.80657: done extending task lists 34052 1727204419.80658: done processing included files 34052 1727204419.80659: results queue empty 34052 1727204419.80660: checking for any_errors_fatal 34052 1727204419.80664: done checking for any_errors_fatal 34052 1727204419.80665: checking for max_fail_percentage 34052 1727204419.80668: done checking for max_fail_percentage 34052 1727204419.80669: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.80670: done checking to see if all hosts have failed 34052 1727204419.80671: getting the remaining hosts for this loop 34052 1727204419.80672: done getting the remaining hosts for this loop 34052 1727204419.80674: getting the next task for host managed-node1 34052 1727204419.80678: done getting next task for host managed-node1 34052 1727204419.80681: ^ task is: TASK: Ensure state in ["present", "absent"] 34052 1727204419.80684: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204419.80686: getting variables 34052 1727204419.80688: in VariableManager get_vars() 34052 1727204419.80713: Calling all_inventory to load vars for managed-node1 34052 1727204419.80716: Calling groups_inventory to load vars for managed-node1 34052 1727204419.80718: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.80727: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.80730: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.80733: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.80939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.81203: done with get_vars() 34052 1727204419.81218: done getting variables 34052 1727204419.81313: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.056) 0:00:06.130 ***** 34052 1727204419.81355: entering _queue_task() for managed-node1/fail 34052 1727204419.81357: Creating lock for fail 34052 1727204419.81758: worker is 1 (out of 1 available) 34052 1727204419.81774: exiting _queue_task() for managed-node1/fail 34052 1727204419.81974: done queuing things up, now waiting for results queue to drain 34052 1727204419.81976: waiting for pending results... 
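The include processed just above (tests_ipv6.yml:11) is what pulled manage_test_interface.yml into the task list. A hedged sketch of that include; the relative path, and any vars the real playbook passes (interface name, state, type), are assumptions not visible at this point in the trace:

# Hypothetical reconstruction of the include at tests_ipv6.yml:11.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml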
34052 1727204419.82190: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 34052 1727204419.82297: in run() - task 127b8e07-fff9-66a4-e2a3-000000000156 34052 1727204419.82337: variable 'ansible_search_path' from source: unknown 34052 1727204419.82345: variable 'ansible_search_path' from source: unknown 34052 1727204419.82418: calling self._execute() 34052 1727204419.82554: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.82632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.82636: variable 'omit' from source: magic vars 34052 1727204419.83074: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.83115: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.83439: variable 'state' from source: include params 34052 1727204419.83501: Evaluated conditional (state not in ["present", "absent"]): False 34052 1727204419.83505: when evaluation is False, skipping this task 34052 1727204419.83507: _execute() done 34052 1727204419.83509: dumping result to json 34052 1727204419.83511: done dumping result, returning 34052 1727204419.83541: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [127b8e07-fff9-66a4-e2a3-000000000156] 34052 1727204419.83547: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000156 34052 1727204419.83844: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000156 34052 1727204419.83847: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 34052 1727204419.83919: no more pending results, returning what we have 34052 1727204419.83924: results queue empty 34052 1727204419.83924: checking for any_errors_fatal 34052 1727204419.83929: done checking for any_errors_fatal 34052 1727204419.83930: checking for max_fail_percentage 34052 1727204419.83931: done checking for max_fail_percentage 34052 1727204419.83932: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.83932: done checking to see if all hosts have failed 34052 1727204419.83933: getting the remaining hosts for this loop 34052 1727204419.83935: done getting the remaining hosts for this loop 34052 1727204419.83939: getting the next task for host managed-node1 34052 1727204419.83946: done getting next task for host managed-node1 34052 1727204419.83949: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 34052 1727204419.83952: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204419.83957: getting variables 34052 1727204419.83959: in VariableManager get_vars() 34052 1727204419.84009: Calling all_inventory to load vars for managed-node1 34052 1727204419.84013: Calling groups_inventory to load vars for managed-node1 34052 1727204419.84015: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.84033: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.84036: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.84040: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.85068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.85518: done with get_vars() 34052 1727204419.85548: done getting variables 34052 1727204419.85622: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.043) 0:00:06.173 ***** 34052 1727204419.85673: entering _queue_task() for managed-node1/fail 34052 1727204419.86044: worker is 1 (out of 1 available) 34052 1727204419.86057: exiting _queue_task() for managed-node1/fail 34052 1727204419.86075: done queuing things up, now waiting for results queue to drain 34052 1727204419.86190: waiting for pending results... 34052 1727204419.86486: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 34052 1727204419.86584: in run() - task 127b8e07-fff9-66a4-e2a3-000000000157 34052 1727204419.86587: variable 'ansible_search_path' from source: unknown 34052 1727204419.86591: variable 'ansible_search_path' from source: unknown 34052 1727204419.86612: calling self._execute() 34052 1727204419.86724: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.86754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.86801: variable 'omit' from source: magic vars 34052 1727204419.87301: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.87304: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.87673: variable 'type' from source: play vars 34052 1727204419.87689: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 34052 1727204419.87696: when evaluation is False, skipping this task 34052 1727204419.87702: _execute() done 34052 1727204419.87708: dumping result to json 34052 1727204419.87719: done dumping result, returning 34052 1727204419.87735: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [127b8e07-fff9-66a4-e2a3-000000000157] 34052 1727204419.87745: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000157 34052 1727204419.87990: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000157 34052 1727204419.87994: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 34052 1727204419.88063: no more pending 
results, returning what we have 34052 1727204419.88070: results queue empty 34052 1727204419.88071: checking for any_errors_fatal 34052 1727204419.88081: done checking for any_errors_fatal 34052 1727204419.88082: checking for max_fail_percentage 34052 1727204419.88084: done checking for max_fail_percentage 34052 1727204419.88084: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.88085: done checking to see if all hosts have failed 34052 1727204419.88086: getting the remaining hosts for this loop 34052 1727204419.88088: done getting the remaining hosts for this loop 34052 1727204419.88093: getting the next task for host managed-node1 34052 1727204419.88101: done getting next task for host managed-node1 34052 1727204419.88104: ^ task is: TASK: Include the task 'show_interfaces.yml' 34052 1727204419.88108: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204419.88113: getting variables 34052 1727204419.88115: in VariableManager get_vars() 34052 1727204419.88283: Calling all_inventory to load vars for managed-node1 34052 1727204419.88286: Calling groups_inventory to load vars for managed-node1 34052 1727204419.88289: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.88300: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.88302: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.88305: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.88782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.89337: done with get_vars() 34052 1727204419.89355: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.039) 0:00:06.213 ***** 34052 1727204419.89632: entering _queue_task() for managed-node1/include_tasks 34052 1727204419.90684: worker is 1 (out of 1 available) 34052 1727204419.90868: exiting _queue_task() for managed-node1/include_tasks 34052 1727204419.90884: done queuing things up, now waiting for results queue to drain 34052 1727204419.90886: waiting for pending results... 
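The two skipped guard tasks above (manage_test_interface.yml:3 and :8) and the include just queued at line 13 are enough to sketch the top of manage_test_interface.yml. The when expressions are taken verbatim from the false_condition fields in the trace; the fail messages are assumptions:

# Hypothetical reconstruction of manage_test_interface.yml, lines 3-13.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be present or absent"    # message text is an assumption
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be dummy, tap or veth"    # message text is an assumption
  when: type not in ["dummy", "tap", "veth"]

- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml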
34052 1727204419.91488: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 34052 1727204419.91993: in run() - task 127b8e07-fff9-66a4-e2a3-000000000158 34052 1727204419.91998: variable 'ansible_search_path' from source: unknown 34052 1727204419.92002: variable 'ansible_search_path' from source: unknown 34052 1727204419.92004: calling self._execute() 34052 1727204419.92098: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.92114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.92134: variable 'omit' from source: magic vars 34052 1727204419.93085: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.93106: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.93119: _execute() done 34052 1727204419.93130: dumping result to json 34052 1727204419.93138: done dumping result, returning 34052 1727204419.93149: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-66a4-e2a3-000000000158] 34052 1727204419.93158: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000158 34052 1727204419.93304: no more pending results, returning what we have 34052 1727204419.93311: in VariableManager get_vars() 34052 1727204419.93368: Calling all_inventory to load vars for managed-node1 34052 1727204419.93371: Calling groups_inventory to load vars for managed-node1 34052 1727204419.93373: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.93391: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.93394: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.93399: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.93753: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000158 34052 1727204419.93756: WORKER PROCESS EXITING 34052 1727204419.93784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.94072: done with get_vars() 34052 1727204419.94082: variable 'ansible_search_path' from source: unknown 34052 1727204419.94083: variable 'ansible_search_path' from source: unknown 34052 1727204419.94123: we have included files to process 34052 1727204419.94124: generating all_blocks data 34052 1727204419.94129: done generating all_blocks data 34052 1727204419.94134: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204419.94135: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204419.94137: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204419.94274: in VariableManager get_vars() 34052 1727204419.94304: done with get_vars() 34052 1727204419.94445: done processing included file 34052 1727204419.94447: iterating over new_blocks loaded from include file 34052 1727204419.94449: in VariableManager get_vars() 34052 1727204419.94469: done with get_vars() 34052 1727204419.94471: filtering new block on tags 34052 1727204419.94494: done filtering new block on tags 34052 1727204419.94497: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 34052 1727204419.94502: extending task lists for all hosts with included blocks 34052 1727204419.94957: done extending task lists 34052 1727204419.94958: done processing included files 34052 1727204419.94959: results queue empty 34052 1727204419.94960: checking for any_errors_fatal 34052 1727204419.94963: done checking for any_errors_fatal 34052 1727204419.94964: checking for max_fail_percentage 34052 1727204419.94967: done checking for max_fail_percentage 34052 1727204419.94968: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.94969: done checking to see if all hosts have failed 34052 1727204419.94970: getting the remaining hosts for this loop 34052 1727204419.94971: done getting the remaining hosts for this loop 34052 1727204419.94973: getting the next task for host managed-node1 34052 1727204419.94978: done getting next task for host managed-node1 34052 1727204419.94980: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34052 1727204419.94983: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204419.94986: getting variables 34052 1727204419.94987: in VariableManager get_vars() 34052 1727204419.95001: Calling all_inventory to load vars for managed-node1 34052 1727204419.95004: Calling groups_inventory to load vars for managed-node1 34052 1727204419.95006: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.95012: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.95014: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.95017: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.95217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.95507: done with get_vars() 34052 1727204419.95517: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.059) 0:00:06.273 ***** 34052 1727204419.95615: entering _queue_task() for managed-node1/include_tasks 34052 1727204419.96053: worker is 1 (out of 1 available) 34052 1727204419.96069: exiting _queue_task() for managed-node1/include_tasks 34052 1727204419.96194: done queuing things up, now waiting for results queue to drain 34052 1727204419.96196: waiting for pending results... 
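Together with the debug task sketched earlier (show_interfaces.yml:5), the include queued above (show_interfaces.yml:3) suggests show_interfaces.yml is a small two-task wrapper. A hedged sketch of the include half; the relative path is an assumption:

# Hypothetical reconstruction of the task at show_interfaces.yml:3.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml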
34052 1727204419.96424: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 34052 1727204419.96476: in run() - task 127b8e07-fff9-66a4-e2a3-00000000017f 34052 1727204419.96495: variable 'ansible_search_path' from source: unknown 34052 1727204419.96502: variable 'ansible_search_path' from source: unknown 34052 1727204419.96553: calling self._execute() 34052 1727204419.96670: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204419.96673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204419.96678: variable 'omit' from source: magic vars 34052 1727204419.97141: variable 'ansible_distribution_major_version' from source: facts 34052 1727204419.97171: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204419.97191: _execute() done 34052 1727204419.97195: dumping result to json 34052 1727204419.97271: done dumping result, returning 34052 1727204419.97275: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-66a4-e2a3-00000000017f] 34052 1727204419.97282: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000017f 34052 1727204419.97370: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000017f 34052 1727204419.97374: WORKER PROCESS EXITING 34052 1727204419.97410: no more pending results, returning what we have 34052 1727204419.97416: in VariableManager get_vars() 34052 1727204419.97477: Calling all_inventory to load vars for managed-node1 34052 1727204419.97481: Calling groups_inventory to load vars for managed-node1 34052 1727204419.97484: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.97621: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.97628: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.97634: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.98063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.98337: done with get_vars() 34052 1727204419.98347: variable 'ansible_search_path' from source: unknown 34052 1727204419.98349: variable 'ansible_search_path' from source: unknown 34052 1727204419.98423: we have included files to process 34052 1727204419.98424: generating all_blocks data 34052 1727204419.98429: done generating all_blocks data 34052 1727204419.98431: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204419.98432: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204419.98434: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204419.98739: done processing included file 34052 1727204419.98741: iterating over new_blocks loaded from include file 34052 1727204419.98743: in VariableManager get_vars() 34052 1727204419.98770: done with get_vars() 34052 1727204419.98772: filtering new block on tags 34052 1727204419.98792: done filtering new block on tags 34052 1727204419.98794: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node1 34052 1727204419.98800: extending task lists for all hosts with included blocks 34052 1727204419.98976: done extending task lists 34052 1727204419.98978: done processing included files 34052 1727204419.98979: results queue empty 34052 1727204419.98980: checking for any_errors_fatal 34052 1727204419.98983: done checking for any_errors_fatal 34052 1727204419.98984: checking for max_fail_percentage 34052 1727204419.98985: done checking for max_fail_percentage 34052 1727204419.98986: checking to see if all hosts have failed and the running result is not ok 34052 1727204419.98986: done checking to see if all hosts have failed 34052 1727204419.98990: getting the remaining hosts for this loop 34052 1727204419.98992: done getting the remaining hosts for this loop 34052 1727204419.98995: getting the next task for host managed-node1 34052 1727204419.98999: done getting next task for host managed-node1 34052 1727204419.99002: ^ task is: TASK: Gather current interface info 34052 1727204419.99005: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204419.99008: getting variables 34052 1727204419.99009: in VariableManager get_vars() 34052 1727204419.99023: Calling all_inventory to load vars for managed-node1 34052 1727204419.99029: Calling groups_inventory to load vars for managed-node1 34052 1727204419.99031: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204419.99037: Calling all_plugins_play to load vars for managed-node1 34052 1727204419.99039: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204419.99042: Calling groups_plugins_play to load vars for managed-node1 34052 1727204419.99257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204419.99522: done with get_vars() 34052 1727204419.99533: done getting variables 34052 1727204419.99581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:19 -0400 (0:00:00.039) 0:00:06.313 ***** 34052 1727204419.99617: entering _queue_task() for managed-node1/command 34052 1727204419.99960: worker is 1 (out of 1 available) 34052 1727204419.99978: exiting _queue_task() for managed-node1/command 34052 1727204419.99993: done queuing things up, now waiting for results queue to drain 34052 1727204419.99995: waiting for pending results... 
34052 1727204420.00533: running TaskExecutor() for managed-node1/TASK: Gather current interface info 34052 1727204420.00539: in run() - task 127b8e07-fff9-66a4-e2a3-0000000001b6 34052 1727204420.00543: variable 'ansible_search_path' from source: unknown 34052 1727204420.00546: variable 'ansible_search_path' from source: unknown 34052 1727204420.00607: calling self._execute() 34052 1727204420.00703: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.00785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.00789: variable 'omit' from source: magic vars 34052 1727204420.01155: variable 'ansible_distribution_major_version' from source: facts 34052 1727204420.01179: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204420.01192: variable 'omit' from source: magic vars 34052 1727204420.01258: variable 'omit' from source: magic vars 34052 1727204420.01306: variable 'omit' from source: magic vars 34052 1727204420.01354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204420.01408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204420.01472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204420.01475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204420.01480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204420.01518: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204420.01527: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.01536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.01655: Set connection var ansible_connection to ssh 34052 1727204420.01671: Set connection var ansible_timeout to 10 34052 1727204420.01690: Set connection var ansible_pipelining to False 34052 1727204420.01693: Set connection var ansible_shell_type to sh 34052 1727204420.01772: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204420.01775: Set connection var ansible_shell_executable to /bin/sh 34052 1727204420.01778: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.01781: variable 'ansible_connection' from source: unknown 34052 1727204420.01783: variable 'ansible_module_compression' from source: unknown 34052 1727204420.01785: variable 'ansible_shell_type' from source: unknown 34052 1727204420.01787: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.01789: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.01791: variable 'ansible_pipelining' from source: unknown 34052 1727204420.01796: variable 'ansible_timeout' from source: unknown 34052 1727204420.01801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.01964: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204420.01987: variable 'omit' from source: magic vars 34052 
1727204420.01998: starting attempt loop 34052 1727204420.02005: running the handler 34052 1727204420.02033: _low_level_execute_command(): starting 34052 1727204420.02070: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204420.02901: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.02964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204420.02970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.02988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.03070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.04842: stdout chunk (state=3): >>>/root <<< 34052 1727204420.04986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.04994: stderr chunk (state=3): >>><<< 34052 1727204420.04998: stdout chunk (state=3): >>><<< 34052 1727204420.05019: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204420.05034: _low_level_execute_command(): starting 34052 1727204420.05041: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050 `" && echo ansible-tmp-1727204420.050193-34419-9779198465050="` echo 
/root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050 `" ) && sleep 0' 34052 1727204420.05517: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204420.05521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.05524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204420.05540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.05588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.05592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.05651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.07719: stdout chunk (state=3): >>>ansible-tmp-1727204420.050193-34419-9779198465050=/root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050 <<< 34052 1727204420.07828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.07889: stderr chunk (state=3): >>><<< 34052 1727204420.07893: stdout chunk (state=3): >>><<< 34052 1727204420.07910: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204420.050193-34419-9779198465050=/root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204420.07945: variable 'ansible_module_compression' from source: unknown 34052 1727204420.07990: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204420.08023: variable 'ansible_facts' from source: unknown 34052 1727204420.08086: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/AnsiballZ_command.py 34052 1727204420.08200: Sending initial data 34052 1727204420.08203: Sent initial data (153 bytes) 34052 1727204420.08703: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204420.08708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.08711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204420.08714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204420.08716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.08768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204420.08772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.08778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.08836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.10530: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204420.10575: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204420.10622: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmph1er71p9 /root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/AnsiballZ_command.py <<< 34052 1727204420.10627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/AnsiballZ_command.py" <<< 34052 1727204420.10674: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmph1er71p9" to remote "/root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/AnsiballZ_command.py" <<< 34052 1727204420.11257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.11340: stderr chunk (state=3): >>><<< 34052 1727204420.11343: stdout chunk (state=3): >>><<< 34052 1727204420.11363: done transferring module to remote 34052 1727204420.11376: _low_level_execute_command(): starting 34052 1727204420.11381: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/ /root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/AnsiballZ_command.py && sleep 0' 34052 1727204420.11859: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204420.11873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.11887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204420.11897: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.11954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204420.11957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.12011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.14073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.14077: stderr chunk (state=3): >>><<< 34052 1727204420.14080: stdout chunk (state=3): >>><<< 34052 1727204420.14082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204420.14085: _low_level_execute_command(): starting 34052 1727204420.14087: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/AnsiballZ_command.py && sleep 0' 34052 1727204420.14663: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204420.14672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204420.14683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204420.14752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204420.14756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204420.14759: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204420.14761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.14764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204420.14768: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204420.14770: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34052 1727204420.14772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204420.14773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204420.14776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204420.14783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204420.14790: stderr chunk (state=3): >>>debug2: match found <<< 34052 1727204420.14799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.14870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204420.14919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.14923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.14986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.32444: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:20.319817", "end": "2024-09-24 15:00:20.323409", "delta": "0:00:00.003592", 
"msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204420.34263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.34596: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. <<< 34052 1727204420.34600: stdout chunk (state=3): >>><<< 34052 1727204420.34603: stderr chunk (state=3): >>><<< 34052 1727204420.34606: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:20.319817", "end": "2024-09-24 15:00:20.323409", "delta": "0:00:00.003592", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204420.34773: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204420.34777: _low_level_execute_command(): starting 34052 1727204420.34781: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204420.050193-34419-9779198465050/ > /dev/null 2>&1 && sleep 0' 34052 1727204420.36067: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204420.36193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.36216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.36382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.36463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.38547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.38571: stdout chunk (state=3): >>><<< 34052 1727204420.38872: stderr chunk (state=3): >>><<< 34052 1727204420.38876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204420.38878: handler run complete 34052 1727204420.38881: Evaluated conditional (False): False 34052 1727204420.38883: attempt loop complete, returning result 34052 1727204420.38885: _execute() done 34052 1727204420.38887: dumping result to json 34052 1727204420.38889: done dumping result, returning 34052 1727204420.38891: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [127b8e07-fff9-66a4-e2a3-0000000001b6] 34052 1727204420.38893: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001b6 34052 1727204420.39088: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001b6 34052 1727204420.39091: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003592", "end": "2024-09-24 15:00:20.323409", "rc": 0, "start": "2024-09-24 15:00:20.319817" } STDOUT: bonding_masters eth0 lo 34052 1727204420.39193: no more pending results, returning what we have 34052 1727204420.39197: results queue empty 34052 1727204420.39198: checking for any_errors_fatal 34052 1727204420.39204: done checking for any_errors_fatal 34052 1727204420.39206: checking for max_fail_percentage 34052 1727204420.39208: done checking for max_fail_percentage 34052 1727204420.39209: checking to see if all hosts have failed and the running result is not ok 34052 1727204420.39210: done checking to see if all hosts have failed 34052 1727204420.39211: getting the remaining hosts for this loop 34052 1727204420.39213: done getting the remaining hosts for this loop 34052 1727204420.39217: getting the next task for host managed-node1 34052 1727204420.39228: done getting next task for host managed-node1 34052 1727204420.39231: ^ task is: TASK: Set current_interfaces 34052 1727204420.39237: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204420.39242: getting variables 34052 1727204420.39244: in VariableManager get_vars() 34052 1727204420.39627: Calling all_inventory to load vars for managed-node1 34052 1727204420.39630: Calling groups_inventory to load vars for managed-node1 34052 1727204420.39633: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204420.39646: Calling all_plugins_play to load vars for managed-node1 34052 1727204420.39648: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204420.39652: Calling groups_plugins_play to load vars for managed-node1 34052 1727204420.40023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204420.40357: done with get_vars() 34052 1727204420.40375: done getting variables 34052 1727204420.40443: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:20 -0400 (0:00:00.408) 0:00:06.722 ***** 34052 1727204420.40487: entering _queue_task() for managed-node1/set_fact 34052 1727204420.40843: worker is 1 (out of 1 available) 34052 1727204420.40857: exiting _queue_task() for managed-node1/set_fact 34052 1727204420.41016: done queuing things up, now waiting for results queue to drain 34052 1727204420.41019: waiting for pending results... 
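The Set current_interfaces task (get_current_interfaces.yml:9) that is about to run simply templates the registered command output into a fact. A minimal sketch, assuming the registered variable is named _current_interfaces as the variable trace below suggests:

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # yields ['bonding_masters', 'eth0', 'lo']
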
34052 1727204420.41259: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 34052 1727204420.41286: in run() - task 127b8e07-fff9-66a4-e2a3-0000000001b7 34052 1727204420.41309: variable 'ansible_search_path' from source: unknown 34052 1727204420.41316: variable 'ansible_search_path' from source: unknown 34052 1727204420.41369: calling self._execute() 34052 1727204420.41473: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.41486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.41509: variable 'omit' from source: magic vars 34052 1727204420.41953: variable 'ansible_distribution_major_version' from source: facts 34052 1727204420.41973: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204420.42010: variable 'omit' from source: magic vars 34052 1727204420.42063: variable 'omit' from source: magic vars 34052 1727204420.42197: variable '_current_interfaces' from source: set_fact 34052 1727204420.42278: variable 'omit' from source: magic vars 34052 1727204420.42335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204420.42444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204420.42448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204420.42450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204420.42452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204420.42487: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204420.42495: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.42503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.42622: Set connection var ansible_connection to ssh 34052 1727204420.42638: Set connection var ansible_timeout to 10 34052 1727204420.42648: Set connection var ansible_pipelining to False 34052 1727204420.42659: Set connection var ansible_shell_type to sh 34052 1727204420.42674: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204420.42690: Set connection var ansible_shell_executable to /bin/sh 34052 1727204420.42721: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.42732: variable 'ansible_connection' from source: unknown 34052 1727204420.42738: variable 'ansible_module_compression' from source: unknown 34052 1727204420.42769: variable 'ansible_shell_type' from source: unknown 34052 1727204420.42771: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.42775: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.42777: variable 'ansible_pipelining' from source: unknown 34052 1727204420.42779: variable 'ansible_timeout' from source: unknown 34052 1727204420.42781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.42948: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34052 1727204420.42991: variable 'omit' from source: magic vars 34052 1727204420.42994: starting attempt loop 34052 1727204420.42997: running the handler 34052 1727204420.42999: handler run complete 34052 1727204420.43016: attempt loop complete, returning result 34052 1727204420.43069: _execute() done 34052 1727204420.43073: dumping result to json 34052 1727204420.43075: done dumping result, returning 34052 1727204420.43078: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [127b8e07-fff9-66a4-e2a3-0000000001b7] 34052 1727204420.43080: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001b7 ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 34052 1727204420.43324: no more pending results, returning what we have 34052 1727204420.43330: results queue empty 34052 1727204420.43331: checking for any_errors_fatal 34052 1727204420.43342: done checking for any_errors_fatal 34052 1727204420.43343: checking for max_fail_percentage 34052 1727204420.43345: done checking for max_fail_percentage 34052 1727204420.43346: checking to see if all hosts have failed and the running result is not ok 34052 1727204420.43347: done checking to see if all hosts have failed 34052 1727204420.43347: getting the remaining hosts for this loop 34052 1727204420.43349: done getting the remaining hosts for this loop 34052 1727204420.43354: getting the next task for host managed-node1 34052 1727204420.43363: done getting next task for host managed-node1 34052 1727204420.43368: ^ task is: TASK: Show current_interfaces 34052 1727204420.43373: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204420.43378: getting variables 34052 1727204420.43379: in VariableManager get_vars() 34052 1727204420.43577: Calling all_inventory to load vars for managed-node1 34052 1727204420.43581: Calling groups_inventory to load vars for managed-node1 34052 1727204420.43584: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204420.43590: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001b7 34052 1727204420.43593: WORKER PROCESS EXITING 34052 1727204420.43603: Calling all_plugins_play to load vars for managed-node1 34052 1727204420.43606: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204420.43610: Calling groups_plugins_play to load vars for managed-node1 34052 1727204420.43913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204420.44179: done with get_vars() 34052 1727204420.44193: done getting variables 34052 1727204420.44264: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:20 -0400 (0:00:00.038) 0:00:06.760 ***** 34052 1727204420.44300: entering _queue_task() for managed-node1/debug 34052 1727204420.44777: worker is 1 (out of 1 available) 34052 1727204420.44792: exiting _queue_task() for managed-node1/debug 34052 1727204420.44806: done queuing things up, now waiting for results queue to drain 34052 1727204420.44809: waiting for pending results... 
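The Show current_interfaces task (show_interfaces.yml:5) is a plain debug of the fact just set. A minimal sketch consistent with the MSG line that follows:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"
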
34052 1727204420.45021: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 34052 1727204420.45172: in run() - task 127b8e07-fff9-66a4-e2a3-000000000180 34052 1727204420.45198: variable 'ansible_search_path' from source: unknown 34052 1727204420.45213: variable 'ansible_search_path' from source: unknown 34052 1727204420.45271: calling self._execute() 34052 1727204420.45389: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.45402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.45417: variable 'omit' from source: magic vars 34052 1727204420.45904: variable 'ansible_distribution_major_version' from source: facts 34052 1727204420.45928: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204420.45942: variable 'omit' from source: magic vars 34052 1727204420.46005: variable 'omit' from source: magic vars 34052 1727204420.46129: variable 'current_interfaces' from source: set_fact 34052 1727204420.46163: variable 'omit' from source: magic vars 34052 1727204420.46218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204420.46265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204420.46293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204420.46329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204420.46352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204420.46391: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204420.46412: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.46415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.46632: Set connection var ansible_connection to ssh 34052 1727204420.46635: Set connection var ansible_timeout to 10 34052 1727204420.46638: Set connection var ansible_pipelining to False 34052 1727204420.46640: Set connection var ansible_shell_type to sh 34052 1727204420.46643: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204420.46645: Set connection var ansible_shell_executable to /bin/sh 34052 1727204420.46648: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.46650: variable 'ansible_connection' from source: unknown 34052 1727204420.46653: variable 'ansible_module_compression' from source: unknown 34052 1727204420.46655: variable 'ansible_shell_type' from source: unknown 34052 1727204420.46657: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.46659: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.46662: variable 'ansible_pipelining' from source: unknown 34052 1727204420.46664: variable 'ansible_timeout' from source: unknown 34052 1727204420.46674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.46843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
34052 1727204420.46864: variable 'omit' from source: magic vars 34052 1727204420.46954: starting attempt loop 34052 1727204420.46959: running the handler 34052 1727204420.46962: handler run complete 34052 1727204420.46965: attempt loop complete, returning result 34052 1727204420.46975: _execute() done 34052 1727204420.46983: dumping result to json 34052 1727204420.46991: done dumping result, returning 34052 1727204420.47002: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [127b8e07-fff9-66a4-e2a3-000000000180] 34052 1727204420.47011: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000180 34052 1727204420.47231: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000180 34052 1727204420.47235: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 34052 1727204420.47296: no more pending results, returning what we have 34052 1727204420.47299: results queue empty 34052 1727204420.47300: checking for any_errors_fatal 34052 1727204420.47306: done checking for any_errors_fatal 34052 1727204420.47307: checking for max_fail_percentage 34052 1727204420.47309: done checking for max_fail_percentage 34052 1727204420.47310: checking to see if all hosts have failed and the running result is not ok 34052 1727204420.47311: done checking to see if all hosts have failed 34052 1727204420.47311: getting the remaining hosts for this loop 34052 1727204420.47313: done getting the remaining hosts for this loop 34052 1727204420.47319: getting the next task for host managed-node1 34052 1727204420.47330: done getting next task for host managed-node1 34052 1727204420.47333: ^ task is: TASK: Install iproute 34052 1727204420.47337: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204420.47457: getting variables 34052 1727204420.47460: in VariableManager get_vars() 34052 1727204420.47504: Calling all_inventory to load vars for managed-node1 34052 1727204420.47507: Calling groups_inventory to load vars for managed-node1 34052 1727204420.47509: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204420.47521: Calling all_plugins_play to load vars for managed-node1 34052 1727204420.47524: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204420.47529: Calling groups_plugins_play to load vars for managed-node1 34052 1727204420.47888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204420.48144: done with get_vars() 34052 1727204420.48155: done getting variables 34052 1727204420.48213: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:00:20 -0400 (0:00:00.039) 0:00:06.799 ***** 34052 1727204420.48248: entering _queue_task() for managed-node1/package 34052 1727204420.48685: worker is 1 (out of 1 available) 34052 1727204420.48697: exiting _queue_task() for managed-node1/package 34052 1727204420.48709: done queuing things up, now waiting for results queue to drain 34052 1727204420.48711: waiting for pending results... 
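The Install iproute task (manage_test_interface.yml:16) goes through the generic package action, which the log shows being resolved to ansible.legacy.dnf with name: iproute and state: present. A sketch consistent with the logged module arguments and the later "__install_status is success" check; the retry values are illustrative assumptions, as only the until condition is visible in the log:

    - name: Install iproute
      package:
        name: iproute
        state: present
      register: __install_status          # referenced below in "Evaluated conditional (__install_status is success)"
      until: __install_status is success
      retries: 6                          # assumed value
      delay: 10                           # assumed value
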
34052 1727204420.48951: running TaskExecutor() for managed-node1/TASK: Install iproute 34052 1727204420.49047: in run() - task 127b8e07-fff9-66a4-e2a3-000000000159 34052 1727204420.49051: variable 'ansible_search_path' from source: unknown 34052 1727204420.49054: variable 'ansible_search_path' from source: unknown 34052 1727204420.49061: calling self._execute() 34052 1727204420.49153: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.49169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.49184: variable 'omit' from source: magic vars 34052 1727204420.49601: variable 'ansible_distribution_major_version' from source: facts 34052 1727204420.49622: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204420.49700: variable 'omit' from source: magic vars 34052 1727204420.49705: variable 'omit' from source: magic vars 34052 1727204420.49903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204420.52422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204420.52484: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204420.52570: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204420.52581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204420.52611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204420.52734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204420.52782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204420.52864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204420.52867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204420.52889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204420.53016: variable '__network_is_ostree' from source: set_fact 34052 1727204420.53030: variable 'omit' from source: magic vars 34052 1727204420.53068: variable 'omit' from source: magic vars 34052 1727204420.53170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204420.53174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204420.53176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204420.53196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 34052 1727204420.53215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204420.53252: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204420.53260: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.53270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.53386: Set connection var ansible_connection to ssh 34052 1727204420.53400: Set connection var ansible_timeout to 10 34052 1727204420.53422: Set connection var ansible_pipelining to False 34052 1727204420.53428: Set connection var ansible_shell_type to sh 34052 1727204420.53471: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204420.53474: Set connection var ansible_shell_executable to /bin/sh 34052 1727204420.53481: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.53488: variable 'ansible_connection' from source: unknown 34052 1727204420.53494: variable 'ansible_module_compression' from source: unknown 34052 1727204420.53500: variable 'ansible_shell_type' from source: unknown 34052 1727204420.53507: variable 'ansible_shell_executable' from source: unknown 34052 1727204420.53531: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204420.53534: variable 'ansible_pipelining' from source: unknown 34052 1727204420.53536: variable 'ansible_timeout' from source: unknown 34052 1727204420.53540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204420.53873: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204420.53877: variable 'omit' from source: magic vars 34052 1727204420.53879: starting attempt loop 34052 1727204420.53882: running the handler 34052 1727204420.53883: variable 'ansible_facts' from source: unknown 34052 1727204420.53885: variable 'ansible_facts' from source: unknown 34052 1727204420.53887: _low_level_execute_command(): starting 34052 1727204420.53889: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204420.54497: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204420.54512: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 34052 1727204420.54532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.54625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.56488: stdout chunk (state=3): >>>/root <<< 34052 1727204420.56505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.56605: stderr chunk (state=3): >>><<< 34052 1727204420.56622: stdout chunk (state=3): >>><<< 34052 1727204420.56652: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204420.56680: _low_level_execute_command(): starting 34052 1727204420.56695: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407 `" && echo ansible-tmp-1727204420.5666723-34487-223028638669407="` echo /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407 `" ) && sleep 0' 34052 1727204420.57393: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204420.57409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204420.57429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204420.57540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.57557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 
1727204420.57649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.59739: stdout chunk (state=3): >>>ansible-tmp-1727204420.5666723-34487-223028638669407=/root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407 <<< 34052 1727204420.59959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.59963: stdout chunk (state=3): >>><<< 34052 1727204420.59968: stderr chunk (state=3): >>><<< 34052 1727204420.59996: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204420.5666723-34487-223028638669407=/root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204420.60043: variable 'ansible_module_compression' from source: unknown 34052 1727204420.60130: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 34052 1727204420.60140: ANSIBALLZ: Acquiring lock 34052 1727204420.60372: ANSIBALLZ: Lock acquired: 140141530567488 34052 1727204420.60375: ANSIBALLZ: Creating module 34052 1727204420.77322: ANSIBALLZ: Writing module into payload 34052 1727204420.77469: ANSIBALLZ: Writing module 34052 1727204420.77490: ANSIBALLZ: Renaming module 34052 1727204420.77496: ANSIBALLZ: Done creating module 34052 1727204420.77516: variable 'ansible_facts' from source: unknown 34052 1727204420.77581: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/AnsiballZ_dnf.py 34052 1727204420.77698: Sending initial data 34052 1727204420.77702: Sent initial data (152 bytes) 34052 1727204420.78392: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204420.78412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.78441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.78522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.80288: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204420.80339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204420.80395: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpz0icqu7k /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/AnsiballZ_dnf.py <<< 34052 1727204420.80399: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/AnsiballZ_dnf.py" <<< 34052 1727204420.80446: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpz0icqu7k" to remote "/root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/AnsiballZ_dnf.py" <<< 34052 1727204420.80449: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/AnsiballZ_dnf.py" <<< 34052 1727204420.81187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.81267: stderr chunk (state=3): >>><<< 34052 1727204420.81271: stdout chunk (state=3): >>><<< 34052 1727204420.81292: done transferring module to remote 34052 1727204420.81303: _low_level_execute_command(): starting 34052 1727204420.81308: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/ /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/AnsiballZ_dnf.py && sleep 0' 34052 1727204420.82040: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204420.82045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.82049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.82051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204420.82054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.82099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.82190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204420.84173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204420.84219: stderr chunk (state=3): >>><<< 34052 1727204420.84223: stdout chunk (state=3): >>><<< 34052 1727204420.84238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204420.84241: _low_level_execute_command(): starting 34052 1727204420.84247: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/AnsiballZ_dnf.py && sleep 0' 34052 1727204420.84778: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204420.84782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.84785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204420.84787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204420.84789: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204420.84827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204420.84846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204420.84908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.00108: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 34052 1727204422.04979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204422.04984: stderr chunk (state=3): >>><<< 34052 1727204422.04986: stdout chunk (state=3): >>><<< 34052 1727204422.05174: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204422.05179: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204422.05187: _low_level_execute_command(): starting 34052 1727204422.05190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204420.5666723-34487-223028638669407/ > /dev/null 2>&1 && sleep 0' 34052 1727204422.05725: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204422.05742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.05746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.05762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204422.05776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204422.05784: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204422.05793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.05807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204422.05815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204422.05822: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34052 1727204422.05851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.05854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.05857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204422.05859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204422.05875: stderr chunk (state=3): >>>debug2: match found <<< 34052 1727204422.05883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.05960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204422.05969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.05990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.06068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.08284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.08288: stdout chunk 
(state=3): >>><<< 34052 1727204422.08290: stderr chunk (state=3): >>><<< 34052 1727204422.08293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.08296: handler run complete 34052 1727204422.08435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204422.08659: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204422.08702: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204422.08736: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204422.08779: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204422.08870: variable '__install_status' from source: unknown 34052 1727204422.08893: Evaluated conditional (__install_status is success): True 34052 1727204422.08912: attempt loop complete, returning result 34052 1727204422.08915: _execute() done 34052 1727204422.08918: dumping result to json 34052 1727204422.08924: done dumping result, returning 34052 1727204422.08937: done running TaskExecutor() for managed-node1/TASK: Install iproute [127b8e07-fff9-66a4-e2a3-000000000159] 34052 1727204422.08942: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000159 34052 1727204422.09062: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000159 34052 1727204422.09067: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 34052 1727204422.09179: no more pending results, returning what we have 34052 1727204422.09183: results queue empty 34052 1727204422.09184: checking for any_errors_fatal 34052 1727204422.09191: done checking for any_errors_fatal 34052 1727204422.09192: checking for max_fail_percentage 34052 1727204422.09193: done checking for max_fail_percentage 34052 1727204422.09194: checking to see if all hosts have failed and the running result is not ok 34052 1727204422.09195: done checking to see if all hosts have failed 34052 1727204422.09196: getting the remaining hosts for this loop 34052 1727204422.09198: done getting the remaining hosts for this loop 34052 1727204422.09203: getting the next task for host managed-node1 34052 
1727204422.09211: done getting next task for host managed-node1 34052 1727204422.09214: ^ task is: TASK: Create veth interface {{ interface }} 34052 1727204422.09218: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204422.09222: getting variables 34052 1727204422.09224: in VariableManager get_vars() 34052 1727204422.09379: Calling all_inventory to load vars for managed-node1 34052 1727204422.09383: Calling groups_inventory to load vars for managed-node1 34052 1727204422.09386: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204422.09398: Calling all_plugins_play to load vars for managed-node1 34052 1727204422.09401: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204422.09405: Calling groups_plugins_play to load vars for managed-node1 34052 1727204422.09895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204422.10175: done with get_vars() 34052 1727204422.10188: done getting variables 34052 1727204422.10260: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204422.10403: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:00:22 -0400 (0:00:01.622) 0:00:08.421 ***** 34052 1727204422.10457: entering _queue_task() for managed-node1/command 34052 1727204422.10917: worker is 1 (out of 1 available) 34052 1727204422.10932: exiting _queue_task() for managed-node1/command 34052 1727204422.10947: done queuing things up, now waiting for results queue to drain 34052 1727204422.10948: waiting for pending results... 
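The Create veth interface veth0 task (manage_test_interface.yml:27) loops a command over items (the items lookup plugin is loaded below) and is gated on type == 'veth' and state == 'present' and interface not in current_interfaces. A sketch of such a task; the exact ip link commands are assumptions, since only the loop and the conditional are visible in the log:

    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}  # assumed command set
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces
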
34052 1727204422.11322: running TaskExecutor() for managed-node1/TASK: Create veth interface veth0 34052 1727204422.11330: in run() - task 127b8e07-fff9-66a4-e2a3-00000000015a 34052 1727204422.11341: variable 'ansible_search_path' from source: unknown 34052 1727204422.11344: variable 'ansible_search_path' from source: unknown 34052 1727204422.11676: variable 'interface' from source: play vars 34052 1727204422.11804: variable 'interface' from source: play vars 34052 1727204422.11856: variable 'interface' from source: play vars 34052 1727204422.12149: Loaded config def from plugin (lookup/items) 34052 1727204422.12162: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 34052 1727204422.12186: variable 'omit' from source: magic vars 34052 1727204422.12322: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204422.12337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204422.12353: variable 'omit' from source: magic vars 34052 1727204422.12760: variable 'ansible_distribution_major_version' from source: facts 34052 1727204422.12764: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204422.13022: variable 'type' from source: play vars 34052 1727204422.13025: variable 'state' from source: include params 34052 1727204422.13032: variable 'interface' from source: play vars 34052 1727204422.13036: variable 'current_interfaces' from source: set_fact 34052 1727204422.13043: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34052 1727204422.13051: variable 'omit' from source: magic vars 34052 1727204422.13212: variable 'omit' from source: magic vars 34052 1727204422.13268: variable 'item' from source: unknown 34052 1727204422.13361: variable 'item' from source: unknown 34052 1727204422.13381: variable 'omit' from source: magic vars 34052 1727204422.13426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204422.13464: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204422.13488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204422.13514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204422.13572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204422.13575: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204422.13577: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204422.13580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204422.13682: Set connection var ansible_connection to ssh 34052 1727204422.13691: Set connection var ansible_timeout to 10 34052 1727204422.13698: Set connection var ansible_pipelining to False 34052 1727204422.13701: Set connection var ansible_shell_type to sh 34052 1727204422.13709: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204422.13718: Set connection var ansible_shell_executable to /bin/sh 34052 1727204422.13770: variable 'ansible_shell_executable' from source: unknown 34052 1727204422.13773: variable 'ansible_connection' from source: unknown 34052 1727204422.13775: 
variable 'ansible_module_compression' from source: unknown 34052 1727204422.13778: variable 'ansible_shell_type' from source: unknown 34052 1727204422.13780: variable 'ansible_shell_executable' from source: unknown 34052 1727204422.13782: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204422.13784: variable 'ansible_pipelining' from source: unknown 34052 1727204422.13786: variable 'ansible_timeout' from source: unknown 34052 1727204422.13788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204422.13971: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204422.13976: variable 'omit' from source: magic vars 34052 1727204422.13979: starting attempt loop 34052 1727204422.13981: running the handler 34052 1727204422.13983: _low_level_execute_command(): starting 34052 1727204422.13986: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204422.14919: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.14979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.15045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.16818: stdout chunk (state=3): >>>/root <<< 34052 1727204422.17036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.17040: stdout chunk (state=3): >>><<< 34052 1727204422.17043: stderr chunk (state=3): >>><<< 34052 1727204422.17179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.17183: _low_level_execute_command(): starting 34052 1727204422.17187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456 `" && echo ansible-tmp-1727204422.1707335-34698-265244418087456="` echo /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456 `" ) && sleep 0' 34052 1727204422.17889: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.17934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204422.17947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.17971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.18060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.20180: stdout chunk (state=3): >>>ansible-tmp-1727204422.1707335-34698-265244418087456=/root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456 <<< 34052 1727204422.20410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.20414: stdout chunk (state=3): >>><<< 34052 1727204422.20417: stderr chunk (state=3): >>><<< 34052 1727204422.20472: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204422.1707335-34698-265244418087456=/root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.20494: variable 'ansible_module_compression' from source: unknown 34052 1727204422.20722: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204422.20727: variable 'ansible_facts' from source: unknown 34052 1727204422.21049: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/AnsiballZ_command.py 34052 1727204422.21629: Sending initial data 34052 1727204422.21634: Sent initial data (156 bytes) 34052 1727204422.22977: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.22981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.23362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.25174: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34052 1727204422.25179: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 34052 1727204422.25181: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 34052 1727204422.25183: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 34052 1727204422.25185: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 34052 1727204422.25186: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 34052 1727204422.25188: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 34052 1727204422.25190: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 34052 1727204422.25191: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 34052 1727204422.25193: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 
34052 1727204422.25202: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" <<< 34052 1727204422.25204: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204422.25238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204422.25285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmptqa54ghe /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/AnsiballZ_command.py <<< 34052 1727204422.25295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/AnsiballZ_command.py" <<< 34052 1727204422.25378: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmptqa54ghe" to remote "/root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/AnsiballZ_command.py" <<< 34052 1727204422.26611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.26615: stdout chunk (state=3): >>><<< 34052 1727204422.26618: stderr chunk (state=3): >>><<< 34052 1727204422.26620: done transferring module to remote 34052 1727204422.26622: _low_level_execute_command(): starting 34052 1727204422.26625: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/ /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/AnsiballZ_command.py && sleep 0' 34052 1727204422.27783: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204422.28000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.28013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.28038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204422.28072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204422.28106: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204422.28117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.28136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204422.28245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204422.28250: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34052 1727204422.28252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.28255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.28257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204422.28259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204422.28261: stderr chunk (state=3): >>>debug2: match found <<< 34052 1727204422.28263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 
1727204422.28392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.28493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.28496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.30485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.30556: stderr chunk (state=3): >>><<< 34052 1727204422.30792: stdout chunk (state=3): >>><<< 34052 1727204422.30796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.30799: _low_level_execute_command(): starting 34052 1727204422.30802: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/AnsiballZ_command.py && sleep 0' 34052 1727204422.32002: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.32189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.32210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.32276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204422.32289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.32326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.32460: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 34052 1727204422.50974: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 15:00:22.497039", "end": "2024-09-24 15:00:22.505534", "delta": "0:00:00.008495", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204422.53631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204422.53635: stdout chunk (state=3): >>><<< 34052 1727204422.53647: stderr chunk (state=3): >>><<< 34052 1727204422.53663: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 15:00:22.497039", "end": "2024-09-24 15:00:22.505534", "delta": "0:00:00.008495", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
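The module result above confirms the first loop item succeeded: rc=0, and the pair veth0/peerveth0 now exists on managed-node1 (delta roughly 8 ms). For reference, the same step can be reproduced and checked by hand with iproute2; a minimal sketch using the interface names from the log:

    # create the veth pair, exactly as the loop item did
    ip link add veth0 type veth peer name peerveth0
    # brief listing of veth devices; both ends should appear, initially DOWN
    ip -br link show type veth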
34052 1727204422.53975: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204422.53983: _low_level_execute_command(): starting 34052 1727204422.53986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204422.1707335-34698-265244418087456/ > /dev/null 2>&1 && sleep 0' 34052 1727204422.55374: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204422.55383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.55394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.55410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204422.55422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204422.55574: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.55836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.55852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.56086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.61406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.61828: stderr chunk (state=3): >>><<< 34052 1727204422.61832: stdout chunk (state=3): >>><<< 34052 1727204422.61834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.61837: handler run complete 34052 1727204422.61839: Evaluated conditional (False): False 34052 1727204422.61841: attempt loop complete, returning result 34052 1727204422.61842: variable 'item' from source: unknown 34052 1727204422.61844: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.008495", "end": "2024-09-24 15:00:22.505534", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-24 15:00:22.497039" } 34052 1727204422.62226: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204422.62229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204422.62232: variable 'omit' from source: magic vars 34052 1727204422.62556: variable 'ansible_distribution_major_version' from source: facts 34052 1727204422.62560: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204422.62591: variable 'type' from source: play vars 34052 1727204422.62595: variable 'state' from source: include params 34052 1727204422.62597: variable 'interface' from source: play vars 34052 1727204422.62618: variable 'current_interfaces' from source: set_fact 34052 1727204422.62621: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34052 1727204422.62623: variable 'omit' from source: magic vars 34052 1727204422.62635: variable 'omit' from source: magic vars 34052 1727204422.62678: variable 'item' from source: unknown 34052 1727204422.62873: variable 'item' from source: unknown 34052 1727204422.62878: variable 'omit' from source: magic vars 34052 1727204422.62881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204422.62883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204422.62886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204422.62888: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204422.62890: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204422.62893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204422.62923: Set connection var ansible_connection to ssh 34052 1727204422.62926: Set connection var ansible_timeout to 10 34052 1727204422.62929: Set connection var ansible_pipelining to False 34052 1727204422.62931: Set connection var ansible_shell_type to sh 34052 1727204422.62933: Set connection var 
ansible_module_compression to ZIP_DEFLATED 34052 1727204422.62953: Set connection var ansible_shell_executable to /bin/sh 34052 1727204422.62971: variable 'ansible_shell_executable' from source: unknown 34052 1727204422.62975: variable 'ansible_connection' from source: unknown 34052 1727204422.62982: variable 'ansible_module_compression' from source: unknown 34052 1727204422.62985: variable 'ansible_shell_type' from source: unknown 34052 1727204422.62987: variable 'ansible_shell_executable' from source: unknown 34052 1727204422.62989: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204422.62991: variable 'ansible_pipelining' from source: unknown 34052 1727204422.62993: variable 'ansible_timeout' from source: unknown 34052 1727204422.62995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204422.63199: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204422.63207: variable 'omit' from source: magic vars 34052 1727204422.63211: starting attempt loop 34052 1727204422.63213: running the handler 34052 1727204422.63215: _low_level_execute_command(): starting 34052 1727204422.63217: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204422.63819: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204422.63828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.63842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.63880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.63962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204422.63975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204422.63983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.63997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.64083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.65839: stdout chunk (state=3): >>>/root <<< 34052 1727204422.65990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.66079: stderr chunk (state=3): >>><<< 34052 1727204422.66248: stdout chunk (state=3): >>><<< 34052 1727204422.66252: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.66254: _low_level_execute_command(): starting 34052 1727204422.66257: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662 `" && echo ansible-tmp-1727204422.6616857-34698-252025647232662="` echo /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662 `" ) && sleep 0' 34052 1727204422.67427: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204422.67437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.67473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204422.67479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204422.67552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.67556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204422.67562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.67590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.67672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.69999: stdout chunk (state=3): >>>ansible-tmp-1727204422.6616857-34698-252025647232662=/root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662 <<< 34052 1727204422.70013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.70178: stderr chunk (state=3): >>><<< 34052 1727204422.70182: stdout chunk (state=3): >>><<< 34052 
1727204422.70185: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204422.6616857-34698-252025647232662=/root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.70187: variable 'ansible_module_compression' from source: unknown 34052 1727204422.70190: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204422.70219: variable 'ansible_facts' from source: unknown 34052 1727204422.70310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/AnsiballZ_command.py 34052 1727204422.70487: Sending initial data 34052 1727204422.70496: Sent initial data (156 bytes) 34052 1727204422.71668: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.71692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.71856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.73554: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204422.73641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204422.73715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpnbqjf3j9 /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/AnsiballZ_command.py <<< 34052 1727204422.73724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/AnsiballZ_command.py" <<< 34052 1727204422.73798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpnbqjf3j9" to remote "/root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/AnsiballZ_command.py" <<< 34052 1727204422.74853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.75040: stdout chunk (state=3): >>><<< 34052 1727204422.75044: stderr chunk (state=3): >>><<< 34052 1727204422.75046: done transferring module to remote 34052 1727204422.75049: _low_level_execute_command(): starting 34052 1727204422.75051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/ /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/AnsiballZ_command.py && sleep 0' 34052 1727204422.76041: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.76058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204422.76080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.76145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.78180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204422.78197: stderr chunk (state=3): >>><<< 34052 1727204422.78206: stdout chunk (state=3): >>><<< 34052 
1727204422.78235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204422.78244: _low_level_execute_command(): starting 34052 1727204422.78255: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/AnsiballZ_command.py && sleep 0' 34052 1727204422.78949: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204422.79074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.79100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.79200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204422.96842: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 15:00:22.963029", "end": "2024-09-24 15:00:22.967223", "delta": "0:00:00.004194", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204422.98796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204422.98800: stdout chunk (state=3): >>><<< 34052 1727204422.98803: stderr chunk (state=3): >>><<< 34052 1727204422.98806: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 15:00:22.963029", "end": "2024-09-24 15:00:22.967223", "delta": "0:00:00.004194", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
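The second loop item has now brought the peer end up (rc=0, delta roughly 4 ms). For reference, the link state can be checked by hand; an administratively up veth end reports LOWERLAYERDOWN until its peer is also brought up, and deleting either end of a veth pair removes both. A small sketch, assuming the same interface names:

    # peerveth0 is admin up; operational state stays LOWERLAYERDOWN until veth0 is also up
    ip -br link show peerveth0
    # when the test interface is no longer needed, deleting one end removes the pair
    ip link del veth0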
34052 1727204422.98814: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204422.98816: _low_level_execute_command(): starting 34052 1727204422.98819: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204422.6616857-34698-252025647232662/ > /dev/null 2>&1 && sleep 0' 34052 1727204422.99419: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204422.99483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204422.99551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204422.99573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204422.99597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204422.99691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.01776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.01791: stderr chunk (state=3): >>><<< 34052 1727204423.01799: stdout chunk (state=3): >>><<< 34052 1727204423.01821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.01836: handler run complete 34052 1727204423.01863: Evaluated conditional (False): False 34052 1727204423.01881: attempt loop complete, returning result 34052 1727204423.01904: variable 'item' from source: unknown 34052 1727204423.01998: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.004194", "end": "2024-09-24 15:00:22.967223", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-24 15:00:22.963029" } 34052 1727204423.02289: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.02293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.02295: variable 'omit' from source: magic vars 34052 1727204423.02445: variable 'ansible_distribution_major_version' from source: facts 34052 1727204423.02471: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204423.02679: variable 'type' from source: play vars 34052 1727204423.02724: variable 'state' from source: include params 34052 1727204423.02729: variable 'interface' from source: play vars 34052 1727204423.02732: variable 'current_interfaces' from source: set_fact 34052 1727204423.02734: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34052 1727204423.02736: variable 'omit' from source: magic vars 34052 1727204423.02742: variable 'omit' from source: magic vars 34052 1727204423.02790: variable 'item' from source: unknown 34052 1727204423.02870: variable 'item' from source: unknown 34052 1727204423.02942: variable 'omit' from source: magic vars 34052 1727204423.02945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204423.02948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204423.02950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204423.02957: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204423.02964: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.02973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.03064: Set connection var ansible_connection to ssh 34052 1727204423.03078: Set connection var ansible_timeout to 10 34052 1727204423.03088: Set connection var ansible_pipelining to False 34052 1727204423.03094: Set connection var ansible_shell_type to sh 34052 1727204423.03105: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204423.03116: Set connection var ansible_shell_executable to /bin/sh 34052 1727204423.03147: variable 'ansible_shell_executable' from source: unknown 34052 1727204423.03268: variable 'ansible_connection' from source: unknown 34052 
1727204423.03271: variable 'ansible_module_compression' from source: unknown 34052 1727204423.03275: variable 'ansible_shell_type' from source: unknown 34052 1727204423.03277: variable 'ansible_shell_executable' from source: unknown 34052 1727204423.03279: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.03281: variable 'ansible_pipelining' from source: unknown 34052 1727204423.03283: variable 'ansible_timeout' from source: unknown 34052 1727204423.03285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.03308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204423.03321: variable 'omit' from source: magic vars 34052 1727204423.03338: starting attempt loop 34052 1727204423.03378: running the handler 34052 1727204423.03389: _low_level_execute_command(): starting 34052 1727204423.03397: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204423.05142: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204423.05589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204423.05703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.06189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.06280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.08175: stdout chunk (state=3): >>>/root <<< 34052 1727204423.08209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.08213: stdout chunk (state=3): >>><<< 34052 1727204423.08222: stderr chunk (state=3): >>><<< 34052 1727204423.08240: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.08250: _low_level_execute_command(): starting 34052 1727204423.08256: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105 `" && echo ansible-tmp-1727204423.0823984-34698-46884214256105="` echo /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105 `" ) && sleep 0' 34052 1727204423.09029: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.09034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.09053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204423.09057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.09145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.09390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.09451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.11543: stdout chunk (state=3): >>>ansible-tmp-1727204423.0823984-34698-46884214256105=/root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105 <<< 34052 1727204423.11964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.11971: stderr chunk (state=3): >>><<< 34052 1727204423.11977: stdout chunk (state=3): >>><<< 34052 1727204423.12188: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204423.0823984-34698-46884214256105=/root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.12210: variable 'ansible_module_compression' from source: unknown 34052 1727204423.12248: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204423.12269: variable 'ansible_facts' from source: unknown 34052 1727204423.12322: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/AnsiballZ_command.py 34052 1727204423.12672: Sending initial data 34052 1727204423.12676: Sent initial data (155 bytes) 34052 1727204423.13889: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204423.13973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.13979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204423.13982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204423.13984: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204423.13986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.13992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204423.13995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204423.14074: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.14175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.14189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.14278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.16088: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204423.16092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204423.16141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpg3377ump /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/AnsiballZ_command.py <<< 34052 1727204423.16145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/AnsiballZ_command.py" <<< 34052 1727204423.16501: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpg3377ump" to remote "/root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/AnsiballZ_command.py" <<< 34052 1727204423.17880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.17885: stderr chunk (state=3): >>><<< 34052 1727204423.17888: stdout chunk (state=3): >>><<< 34052 1727204423.17913: done transferring module to remote 34052 1727204423.17922: _low_level_execute_command(): starting 34052 1727204423.17929: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/ /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/AnsiballZ_command.py && sleep 0' 34052 1727204423.19387: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.19392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204423.19395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.19483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.19607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.21764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.21887: stderr chunk (state=3): >>><<< 34052 1727204423.21891: stdout chunk (state=3): >>><<< 34052 1727204423.21908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.21911: _low_level_execute_command(): starting 34052 1727204423.21971: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/AnsiballZ_command.py && sleep 0' 34052 1727204423.22987: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204423.23190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204423.23201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.23218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204423.23231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204423.23620: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.23624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.23629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.41464: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 15:00:23.409299", "end": "2024-09-24 15:00:23.413362", "delta": "0:00:00.004063", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204423.43291: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.43399: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. <<< 34052 1727204423.43521: stdout chunk (state=3): >>><<< 34052 1727204423.43535: stderr chunk (state=3): >>><<< 34052 1727204423.43549: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 15:00:23.409299", "end": "2024-09-24 15:00:23.413362", "delta": "0:00:00.004063", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
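
The chunk above shows the usual command-module cycle for one loop item: AnsiballZ_command.py is uploaded over SFTP to a per-task temp directory, made executable, run with /usr/bin/python3.12, and the JSON result for the item 'ip link set veth0 up' comes back on stdout. A minimal, hedged sketch of what the looping task driving these items might look like (reconstructed from this trace only; the real YAML in manage_test_interface.yml is not part of this log):

    # Items are shown literally as executed for interface 'veth0'; in the tasks
    # file they are presumably templated on {{ interface }}. The earlier loop
    # item that actually creates the veth pair ran before this excerpt and is
    # omitted here.
    - name: Create veth interface veth0
      ansible.builtin.command: "{{ item }}"
      loop:
        - ip link set peerveth0 up
        - ip link set veth0 up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces
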
34052 1727204423.43586: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204423.43671: _low_level_execute_command(): starting 34052 1727204423.43675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204423.0823984-34698-46884214256105/ > /dev/null 2>&1 && sleep 0' 34052 1727204423.45146: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.45150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34052 1727204423.45152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.45155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.45387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.45604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.47737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.47789: stderr chunk (state=3): >>><<< 34052 1727204423.47792: stdout chunk (state=3): >>><<< 34052 1727204423.47815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.47818: handler run complete 34052 1727204423.47843: Evaluated conditional (False): False 34052 1727204423.47852: attempt loop complete, returning result 34052 1727204423.47984: variable 'item' from source: unknown 34052 1727204423.48068: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.004063", "end": "2024-09-24 15:00:23.413362", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-24 15:00:23.409299" } 34052 1727204423.48766: dumping result to json 34052 1727204423.48770: done dumping result, returning 34052 1727204423.48773: done running TaskExecutor() for managed-node1/TASK: Create veth interface veth0 [127b8e07-fff9-66a4-e2a3-00000000015a] 34052 1727204423.48775: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015a 34052 1727204423.48836: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015a 34052 1727204423.48839: WORKER PROCESS EXITING 34052 1727204423.48934: no more pending results, returning what we have 34052 1727204423.48937: results queue empty 34052 1727204423.48938: checking for any_errors_fatal 34052 1727204423.48943: done checking for any_errors_fatal 34052 1727204423.48944: checking for max_fail_percentage 34052 1727204423.48945: done checking for max_fail_percentage 34052 1727204423.48946: checking to see if all hosts have failed and the running result is not ok 34052 1727204423.48946: done checking to see if all hosts have failed 34052 1727204423.48947: getting the remaining hosts for this loop 34052 1727204423.48948: done getting the remaining hosts for this loop 34052 1727204423.48952: getting the next task for host managed-node1 34052 1727204423.48957: done getting next task for host managed-node1 34052 1727204423.48959: ^ task is: TASK: Set up veth as managed by NetworkManager 34052 1727204423.48962: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204423.49170: getting variables 34052 1727204423.49174: in VariableManager get_vars() 34052 1727204423.49206: Calling all_inventory to load vars for managed-node1 34052 1727204423.49210: Calling groups_inventory to load vars for managed-node1 34052 1727204423.49212: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204423.49223: Calling all_plugins_play to load vars for managed-node1 34052 1727204423.49229: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204423.49232: Calling groups_plugins_play to load vars for managed-node1 34052 1727204423.49662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204423.50122: done with get_vars() 34052 1727204423.50139: done getting variables 34052 1727204423.50210: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:00:23 -0400 (0:00:01.397) 0:00:09.819 ***** 34052 1727204423.50246: entering _queue_task() for managed-node1/command 34052 1727204423.51215: worker is 1 (out of 1 available) 34052 1727204423.51233: exiting _queue_task() for managed-node1/command 34052 1727204423.51248: done queuing things up, now waiting for results queue to drain 34052 1727204423.51250: waiting for pending results... 
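
The task queued above (manage_test_interface.yml:35) runs 'nmcli d set veth0 managed true', as the execution trace that follows shows. A minimal sketch of that task, reconstructed from the evaluated conditional and the command in this trace (the exact YAML is an assumption and presumably uses {{ interface }} rather than the literal name):

    - name: Set up veth as managed by NetworkManager
      ansible.builtin.command: nmcli d set veth0 managed true
      when: type == 'veth' and state == 'present'
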
34052 1727204423.51707: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 34052 1727204423.51773: in run() - task 127b8e07-fff9-66a4-e2a3-00000000015b 34052 1727204423.51909: variable 'ansible_search_path' from source: unknown 34052 1727204423.51914: variable 'ansible_search_path' from source: unknown 34052 1727204423.51918: calling self._execute() 34052 1727204423.51961: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.51975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.51988: variable 'omit' from source: magic vars 34052 1727204423.52399: variable 'ansible_distribution_major_version' from source: facts 34052 1727204423.52421: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204423.52612: variable 'type' from source: play vars 34052 1727204423.52624: variable 'state' from source: include params 34052 1727204423.52634: Evaluated conditional (type == 'veth' and state == 'present'): True 34052 1727204423.52646: variable 'omit' from source: magic vars 34052 1727204423.52701: variable 'omit' from source: magic vars 34052 1727204423.52822: variable 'interface' from source: play vars 34052 1727204423.52847: variable 'omit' from source: magic vars 34052 1727204423.52901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204423.52947: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204423.52976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204423.53051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204423.53109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204423.53113: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204423.53121: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.53129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.53244: Set connection var ansible_connection to ssh 34052 1727204423.53256: Set connection var ansible_timeout to 10 34052 1727204423.53268: Set connection var ansible_pipelining to False 34052 1727204423.53326: Set connection var ansible_shell_type to sh 34052 1727204423.53329: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204423.53331: Set connection var ansible_shell_executable to /bin/sh 34052 1727204423.53334: variable 'ansible_shell_executable' from source: unknown 34052 1727204423.53335: variable 'ansible_connection' from source: unknown 34052 1727204423.53337: variable 'ansible_module_compression' from source: unknown 34052 1727204423.53339: variable 'ansible_shell_type' from source: unknown 34052 1727204423.53340: variable 'ansible_shell_executable' from source: unknown 34052 1727204423.53346: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.53353: variable 'ansible_pipelining' from source: unknown 34052 1727204423.53358: variable 'ansible_timeout' from source: unknown 34052 1727204423.53364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.53693: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204423.53976: variable 'omit' from source: magic vars 34052 1727204423.53980: starting attempt loop 34052 1727204423.53983: running the handler 34052 1727204423.53985: _low_level_execute_command(): starting 34052 1727204423.53988: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204423.54751: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204423.54784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.54859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.54908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204423.54933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.54961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.55086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.56846: stdout chunk (state=3): >>>/root <<< 34052 1727204423.57039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.57065: stderr chunk (state=3): >>><<< 34052 1727204423.57071: stdout chunk (state=3): >>><<< 34052 1727204423.57097: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 34052 1727204423.57124: _low_level_execute_command(): starting 34052 1727204423.57131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775 `" && echo ansible-tmp-1727204423.5709198-34752-100041615736775="` echo /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775 `" ) && sleep 0' 34052 1727204423.57834: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.57847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204423.57907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.57910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.57957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.60013: stdout chunk (state=3): >>>ansible-tmp-1727204423.5709198-34752-100041615736775=/root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775 <<< 34052 1727204423.60138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.60197: stderr chunk (state=3): >>><<< 34052 1727204423.60201: stdout chunk (state=3): >>><<< 34052 1727204423.60218: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204423.5709198-34752-100041615736775=/root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.60250: variable 'ansible_module_compression' from source: unknown 34052 1727204423.60293: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204423.60322: variable 'ansible_facts' from source: unknown 34052 1727204423.60387: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/AnsiballZ_command.py 34052 1727204423.60548: Sending initial data 34052 1727204423.60556: Sent initial data (156 bytes) 34052 1727204423.61295: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.61349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204423.61371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.61397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.61489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.63213: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204423.63274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204423.63311: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpvix5kf11 /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/AnsiballZ_command.py <<< 34052 1727204423.63315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/AnsiballZ_command.py" <<< 34052 1727204423.63394: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpvix5kf11" to remote "/root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/AnsiballZ_command.py" <<< 34052 1727204423.64251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.64340: stderr chunk (state=3): >>><<< 34052 1727204423.64344: stdout chunk (state=3): >>><<< 34052 1727204423.64373: done transferring module to remote 34052 1727204423.64383: _low_level_execute_command(): starting 34052 1727204423.64388: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/ /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/AnsiballZ_command.py && sleep 0' 34052 1727204423.65000: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.65026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.65099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.67158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.67162: stdout chunk (state=3): >>><<< 34052 1727204423.67165: stderr chunk (state=3): >>><<< 34052 1727204423.67170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.67178: _low_level_execute_command(): starting 34052 1727204423.67181: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/AnsiballZ_command.py && sleep 0' 34052 1727204423.67894: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204423.67939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.68030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.87398: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 15:00:23.852437", "end": "2024-09-24 15:00:23.872660", "delta": "0:00:00.020223", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204423.89143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204423.89201: stderr chunk (state=3): >>><<< 34052 1727204423.89206: stdout chunk (state=3): >>><<< 34052 1727204423.89222: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 15:00:23.852437", "end": "2024-09-24 15:00:23.872660", "delta": "0:00:00.020223", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
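
The module_args in the JSON above show the free-form invocation ('_raw_params': 'nmcli d set veth0 managed true') with '_uses_shell': false and 'argv': null. For comparison only, a hedged, purely illustrative equivalent using the command module's explicit argv form (not what this run used) would be:

    - name: Set up veth as managed by NetworkManager (illustrative argv form)
      ansible.builtin.command:
        argv:
          - nmcli
          - d
          - set
          - veth0
          - managed
          - "true"   # quoted so YAML keeps it as a string argument
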
34052 1727204423.89257: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204423.89267: _low_level_execute_command(): starting 34052 1727204423.89273: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204423.5709198-34752-100041615736775/ > /dev/null 2>&1 && sleep 0' 34052 1727204423.89776: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204423.89780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204423.89783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204423.89873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204423.89931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204423.91887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204423.91951: stderr chunk (state=3): >>><<< 34052 1727204423.91955: stdout chunk (state=3): >>><<< 34052 1727204423.91969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 
10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204423.91977: handler run complete 34052 1727204423.91996: Evaluated conditional (False): False 34052 1727204423.92008: attempt loop complete, returning result 34052 1727204423.92011: _execute() done 34052 1727204423.92014: dumping result to json 34052 1727204423.92019: done dumping result, returning 34052 1727204423.92035: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [127b8e07-fff9-66a4-e2a3-00000000015b] 34052 1727204423.92040: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015b 34052 1727204423.92143: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015b 34052 1727204423.92145: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.020223", "end": "2024-09-24 15:00:23.872660", "rc": 0, "start": "2024-09-24 15:00:23.852437" } 34052 1727204423.92215: no more pending results, returning what we have 34052 1727204423.92218: results queue empty 34052 1727204423.92219: checking for any_errors_fatal 34052 1727204423.92232: done checking for any_errors_fatal 34052 1727204423.92233: checking for max_fail_percentage 34052 1727204423.92234: done checking for max_fail_percentage 34052 1727204423.92235: checking to see if all hosts have failed and the running result is not ok 34052 1727204423.92236: done checking to see if all hosts have failed 34052 1727204423.92237: getting the remaining hosts for this loop 34052 1727204423.92239: done getting the remaining hosts for this loop 34052 1727204423.92243: getting the next task for host managed-node1 34052 1727204423.92249: done getting next task for host managed-node1 34052 1727204423.92251: ^ task is: TASK: Delete veth interface {{ interface }} 34052 1727204423.92254: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204423.92258: getting variables 34052 1727204423.92260: in VariableManager get_vars() 34052 1727204423.92301: Calling all_inventory to load vars for managed-node1 34052 1727204423.92304: Calling groups_inventory to load vars for managed-node1 34052 1727204423.92306: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204423.92317: Calling all_plugins_play to load vars for managed-node1 34052 1727204423.92319: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204423.92322: Calling groups_plugins_play to load vars for managed-node1 34052 1727204423.92513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204423.92673: done with get_vars() 34052 1727204423.92688: done getting variables 34052 1727204423.92750: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204423.93284: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:00:23 -0400 (0:00:00.430) 0:00:10.250 ***** 34052 1727204423.93318: entering _queue_task() for managed-node1/command 34052 1727204423.93691: worker is 1 (out of 1 available) 34052 1727204423.93818: exiting _queue_task() for managed-node1/command 34052 1727204423.93837: done queuing things up, now waiting for results queue to drain 34052 1727204423.93839: waiting for pending results... 
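
The task queued above (manage_test_interface.yml:43) is skipped immediately below because its conditional evaluates to False. A rough sketch reconstructed from that conditional; since the delete command never actually runs in this trace, the command line shown here is only an assumption:

    - name: Delete veth interface {{ interface }}
      ansible.builtin.command: ip link del {{ interface }}   # assumed; not executed in this log
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
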
34052 1727204423.94160: running TaskExecutor() for managed-node1/TASK: Delete veth interface veth0 34052 1727204423.94167: in run() - task 127b8e07-fff9-66a4-e2a3-00000000015c 34052 1727204423.94183: variable 'ansible_search_path' from source: unknown 34052 1727204423.94190: variable 'ansible_search_path' from source: unknown 34052 1727204423.94238: calling self._execute() 34052 1727204423.94347: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.94368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.94384: variable 'omit' from source: magic vars 34052 1727204423.94832: variable 'ansible_distribution_major_version' from source: facts 34052 1727204423.94852: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204423.95104: variable 'type' from source: play vars 34052 1727204423.95124: variable 'state' from source: include params 34052 1727204423.95168: variable 'interface' from source: play vars 34052 1727204423.95172: variable 'current_interfaces' from source: set_fact 34052 1727204423.95175: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 34052 1727204423.95178: when evaluation is False, skipping this task 34052 1727204423.95180: _execute() done 34052 1727204423.95182: dumping result to json 34052 1727204423.95184: done dumping result, returning 34052 1727204423.95187: done running TaskExecutor() for managed-node1/TASK: Delete veth interface veth0 [127b8e07-fff9-66a4-e2a3-00000000015c] 34052 1727204423.95193: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015c 34052 1727204423.95320: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015c 34052 1727204423.95323: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204423.95403: no more pending results, returning what we have 34052 1727204423.95407: results queue empty 34052 1727204423.95408: checking for any_errors_fatal 34052 1727204423.95418: done checking for any_errors_fatal 34052 1727204423.95419: checking for max_fail_percentage 34052 1727204423.95421: done checking for max_fail_percentage 34052 1727204423.95421: checking to see if all hosts have failed and the running result is not ok 34052 1727204423.95422: done checking to see if all hosts have failed 34052 1727204423.95423: getting the remaining hosts for this loop 34052 1727204423.95428: done getting the remaining hosts for this loop 34052 1727204423.95433: getting the next task for host managed-node1 34052 1727204423.95439: done getting next task for host managed-node1 34052 1727204423.95442: ^ task is: TASK: Create dummy interface {{ interface }} 34052 1727204423.95446: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204423.95451: getting variables 34052 1727204423.95453: in VariableManager get_vars() 34052 1727204423.95502: Calling all_inventory to load vars for managed-node1 34052 1727204423.95506: Calling groups_inventory to load vars for managed-node1 34052 1727204423.95508: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204423.95523: Calling all_plugins_play to load vars for managed-node1 34052 1727204423.95529: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204423.95533: Calling groups_plugins_play to load vars for managed-node1 34052 1727204423.96053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204423.96314: done with get_vars() 34052 1727204423.96334: done getting variables 34052 1727204423.96400: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204423.96535: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:00:23 -0400 (0:00:00.032) 0:00:10.282 ***** 34052 1727204423.96574: entering _queue_task() for managed-node1/command 34052 1727204423.97020: worker is 1 (out of 1 available) 34052 1727204423.97036: exiting _queue_task() for managed-node1/command 34052 1727204423.97049: done queuing things up, now waiting for results queue to drain 34052 1727204423.97050: waiting for pending results... 
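The skip above illustrates the dispatch pattern used throughout manage_test_interface.yml: each create/delete branch is a command task guarded by a when-clause over type, state and current_interfaces, and in this run (type veth, state present) every other branch evaluates to False. As a sketch only, with the when-clause copied from the logged false_condition and the ip command assumed rather than taken from this log, the skipped delete task plausibly looks like:

    # Sketch only: the when-clause is the logged false_condition verbatim;
    # the "ip link del" command is an assumption, not shown in this log.
    - name: Delete veth interface {{ interface }}
      ansible.builtin.command: ip link del {{ interface }} type veth
      when: type == 'veth' and state == 'absent' and interface in current_interfaces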
34052 1727204423.97335: running TaskExecutor() for managed-node1/TASK: Create dummy interface veth0 34052 1727204423.97432: in run() - task 127b8e07-fff9-66a4-e2a3-00000000015d 34052 1727204423.97436: variable 'ansible_search_path' from source: unknown 34052 1727204423.97439: variable 'ansible_search_path' from source: unknown 34052 1727204423.97467: calling self._execute() 34052 1727204423.97598: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204423.97645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204423.97648: variable 'omit' from source: magic vars 34052 1727204423.98073: variable 'ansible_distribution_major_version' from source: facts 34052 1727204423.98103: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204423.98417: variable 'type' from source: play vars 34052 1727204423.98468: variable 'state' from source: include params 34052 1727204423.98479: variable 'interface' from source: play vars 34052 1727204423.98490: variable 'current_interfaces' from source: set_fact 34052 1727204423.98494: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 34052 1727204423.98496: when evaluation is False, skipping this task 34052 1727204423.98499: _execute() done 34052 1727204423.98502: dumping result to json 34052 1727204423.98520: done dumping result, returning 34052 1727204423.98523: done running TaskExecutor() for managed-node1/TASK: Create dummy interface veth0 [127b8e07-fff9-66a4-e2a3-00000000015d] 34052 1727204423.98541: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015d 34052 1727204423.98714: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015d 34052 1727204423.98717: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204423.98794: no more pending results, returning what we have 34052 1727204423.98798: results queue empty 34052 1727204423.98799: checking for any_errors_fatal 34052 1727204423.98811: done checking for any_errors_fatal 34052 1727204423.98812: checking for max_fail_percentage 34052 1727204423.98814: done checking for max_fail_percentage 34052 1727204423.98814: checking to see if all hosts have failed and the running result is not ok 34052 1727204423.98815: done checking to see if all hosts have failed 34052 1727204423.98816: getting the remaining hosts for this loop 34052 1727204423.98818: done getting the remaining hosts for this loop 34052 1727204423.98823: getting the next task for host managed-node1 34052 1727204423.98833: done getting next task for host managed-node1 34052 1727204423.98836: ^ task is: TASK: Delete dummy interface {{ interface }} 34052 1727204423.98841: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204423.98845: getting variables 34052 1727204423.98847: in VariableManager get_vars() 34052 1727204423.98901: Calling all_inventory to load vars for managed-node1 34052 1727204423.98904: Calling groups_inventory to load vars for managed-node1 34052 1727204423.98907: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204423.99054: Calling all_plugins_play to load vars for managed-node1 34052 1727204423.99059: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204423.99074: Calling groups_plugins_play to load vars for managed-node1 34052 1727204423.99800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.00052: done with get_vars() 34052 1727204424.00066: done getting variables 34052 1727204424.00130: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204424.00367: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.038) 0:00:10.321 ***** 34052 1727204424.00403: entering _queue_task() for managed-node1/command 34052 1727204424.01074: worker is 1 (out of 1 available) 34052 1727204424.01092: exiting _queue_task() for managed-node1/command 34052 1727204424.01105: done queuing things up, now waiting for results queue to drain 34052 1727204424.01107: waiting for pending results... 
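The current_interfaces list consulted by these conditionals comes from an earlier set_fact ("from source: set_fact" in the log); how it is built is not visible here. One common way to populate such a fact, shown purely as an illustration and not as this playbook's actual method, is to list /sys/class/net and register the output:

    # Illustration only; task names, register variable and command are
    # assumptions, not taken from the test playbook.
    - name: Gather current interface names
      ansible.builtin.command: ls /sys/class/net
      register: _ifaces
      changed_when: false

    - name: Record them as a fact for later conditionals
      ansible.builtin.set_fact:
        current_interfaces: "{{ _ifaces.stdout_lines }}"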
34052 1727204424.01611: running TaskExecutor() for managed-node1/TASK: Delete dummy interface veth0 34052 1727204424.01794: in run() - task 127b8e07-fff9-66a4-e2a3-00000000015e 34052 1727204424.01798: variable 'ansible_search_path' from source: unknown 34052 1727204424.01801: variable 'ansible_search_path' from source: unknown 34052 1727204424.01805: calling self._execute() 34052 1727204424.01899: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.01919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.02010: variable 'omit' from source: magic vars 34052 1727204424.02375: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.02394: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.02646: variable 'type' from source: play vars 34052 1727204424.02661: variable 'state' from source: include params 34052 1727204424.02675: variable 'interface' from source: play vars 34052 1727204424.02683: variable 'current_interfaces' from source: set_fact 34052 1727204424.02694: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 34052 1727204424.02700: when evaluation is False, skipping this task 34052 1727204424.02707: _execute() done 34052 1727204424.02714: dumping result to json 34052 1727204424.02720: done dumping result, returning 34052 1727204424.02733: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface veth0 [127b8e07-fff9-66a4-e2a3-00000000015e] 34052 1727204424.02743: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015e skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204424.03109: no more pending results, returning what we have 34052 1727204424.03113: results queue empty 34052 1727204424.03114: checking for any_errors_fatal 34052 1727204424.03119: done checking for any_errors_fatal 34052 1727204424.03120: checking for max_fail_percentage 34052 1727204424.03121: done checking for max_fail_percentage 34052 1727204424.03122: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.03123: done checking to see if all hosts have failed 34052 1727204424.03124: getting the remaining hosts for this loop 34052 1727204424.03127: done getting the remaining hosts for this loop 34052 1727204424.03131: getting the next task for host managed-node1 34052 1727204424.03137: done getting next task for host managed-node1 34052 1727204424.03140: ^ task is: TASK: Create tap interface {{ interface }} 34052 1727204424.03143: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.03147: getting variables 34052 1727204424.03148: in VariableManager get_vars() 34052 1727204424.03190: Calling all_inventory to load vars for managed-node1 34052 1727204424.03193: Calling groups_inventory to load vars for managed-node1 34052 1727204424.03196: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.03208: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.03211: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.03214: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.03640: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015e 34052 1727204424.03644: WORKER PROCESS EXITING 34052 1727204424.03672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.03933: done with get_vars() 34052 1727204424.03948: done getting variables 34052 1727204424.04017: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204424.04157: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.037) 0:00:10.359 ***** 34052 1727204424.04196: entering _queue_task() for managed-node1/command 34052 1727204424.04575: worker is 1 (out of 1 available) 34052 1727204424.04703: exiting _queue_task() for managed-node1/command 34052 1727204424.04716: done queuing things up, now waiting for results queue to drain 34052 1727204424.04718: waiting for pending results... 
34052 1727204424.05150: running TaskExecutor() for managed-node1/TASK: Create tap interface veth0 34052 1727204424.05470: in run() - task 127b8e07-fff9-66a4-e2a3-00000000015f 34052 1727204424.05492: variable 'ansible_search_path' from source: unknown 34052 1727204424.05499: variable 'ansible_search_path' from source: unknown 34052 1727204424.05546: calling self._execute() 34052 1727204424.05647: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.05659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.05673: variable 'omit' from source: magic vars 34052 1727204424.06475: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.06499: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.07121: variable 'type' from source: play vars 34052 1727204424.07138: variable 'state' from source: include params 34052 1727204424.07148: variable 'interface' from source: play vars 34052 1727204424.07157: variable 'current_interfaces' from source: set_fact 34052 1727204424.07223: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 34052 1727204424.07236: when evaluation is False, skipping this task 34052 1727204424.07244: _execute() done 34052 1727204424.07252: dumping result to json 34052 1727204424.07261: done dumping result, returning 34052 1727204424.07277: done running TaskExecutor() for managed-node1/TASK: Create tap interface veth0 [127b8e07-fff9-66a4-e2a3-00000000015f] 34052 1727204424.07543: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015f 34052 1727204424.07624: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000015f 34052 1727204424.07630: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204424.07706: no more pending results, returning what we have 34052 1727204424.07711: results queue empty 34052 1727204424.07712: checking for any_errors_fatal 34052 1727204424.07718: done checking for any_errors_fatal 34052 1727204424.07719: checking for max_fail_percentage 34052 1727204424.07721: done checking for max_fail_percentage 34052 1727204424.07722: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.07723: done checking to see if all hosts have failed 34052 1727204424.07723: getting the remaining hosts for this loop 34052 1727204424.07728: done getting the remaining hosts for this loop 34052 1727204424.07733: getting the next task for host managed-node1 34052 1727204424.07739: done getting next task for host managed-node1 34052 1727204424.07742: ^ task is: TASK: Delete tap interface {{ interface }} 34052 1727204424.07747: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.07752: getting variables 34052 1727204424.07754: in VariableManager get_vars() 34052 1727204424.07807: Calling all_inventory to load vars for managed-node1 34052 1727204424.07810: Calling groups_inventory to load vars for managed-node1 34052 1727204424.07813: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.07832: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.07836: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.07841: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.08978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.09652: done with get_vars() 34052 1727204424.09698: done getting variables 34052 1727204424.09893: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204424.10037: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.060) 0:00:10.420 ***** 34052 1727204424.10295: entering _queue_task() for managed-node1/command 34052 1727204424.10949: worker is 1 (out of 1 available) 34052 1727204424.10964: exiting _queue_task() for managed-node1/command 34052 1727204424.10983: done queuing things up, now waiting for results queue to drain 34052 1727204424.10985: waiting for pending results... 
34052 1727204424.11798: running TaskExecutor() for managed-node1/TASK: Delete tap interface veth0 34052 1727204424.11805: in run() - task 127b8e07-fff9-66a4-e2a3-000000000160 34052 1727204424.11893: variable 'ansible_search_path' from source: unknown 34052 1727204424.12120: variable 'ansible_search_path' from source: unknown 34052 1727204424.12124: calling self._execute() 34052 1727204424.12295: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.12308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.12358: variable 'omit' from source: magic vars 34052 1727204424.13547: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.13593: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.14375: variable 'type' from source: play vars 34052 1727204424.14473: variable 'state' from source: include params 34052 1727204424.14478: variable 'interface' from source: play vars 34052 1727204424.14480: variable 'current_interfaces' from source: set_fact 34052 1727204424.14483: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 34052 1727204424.14486: when evaluation is False, skipping this task 34052 1727204424.14488: _execute() done 34052 1727204424.14490: dumping result to json 34052 1727204424.14493: done dumping result, returning 34052 1727204424.14495: done running TaskExecutor() for managed-node1/TASK: Delete tap interface veth0 [127b8e07-fff9-66a4-e2a3-000000000160] 34052 1727204424.14497: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000160 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204424.14640: no more pending results, returning what we have 34052 1727204424.14644: results queue empty 34052 1727204424.14645: checking for any_errors_fatal 34052 1727204424.14652: done checking for any_errors_fatal 34052 1727204424.14652: checking for max_fail_percentage 34052 1727204424.14654: done checking for max_fail_percentage 34052 1727204424.14654: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.14655: done checking to see if all hosts have failed 34052 1727204424.14656: getting the remaining hosts for this loop 34052 1727204424.14662: done getting the remaining hosts for this loop 34052 1727204424.14668: getting the next task for host managed-node1 34052 1727204424.14678: done getting next task for host managed-node1 34052 1727204424.14681: ^ task is: TASK: Set up gateway ip on veth peer 34052 1727204424.14684: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.14689: getting variables 34052 1727204424.14691: in VariableManager get_vars() 34052 1727204424.14743: Calling all_inventory to load vars for managed-node1 34052 1727204424.14746: Calling groups_inventory to load vars for managed-node1 34052 1727204424.14748: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.14772: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.14777: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.14782: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.15112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.15358: done with get_vars() 34052 1727204424.15374: done getting variables 34052 1727204424.15413: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000160 34052 1727204424.15416: WORKER PROCESS EXITING 34052 1727204424.15569: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.053) 0:00:10.473 ***** 34052 1727204424.15602: entering _queue_task() for managed-node1/shell 34052 1727204424.15604: Creating lock for shell 34052 1727204424.16084: worker is 1 (out of 1 available) 34052 1727204424.16122: exiting _queue_task() for managed-node1/shell 34052 1727204424.16137: done queuing things up, now waiting for results queue to drain 34052 1727204424.16139: waiting for pending results... 
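This is the first shell task of the run, hence the "Creating lock for shell" entry while the action plugin is loaded for the first time. The exact script it executes appears verbatim in the module invocation further down (_raw_params); a sketch of the task at tests_ipv6.yml:15, assuming nothing beyond that script and the task name, would be:

    # Sketch only: script and name come from the log; changed_when is an
    # inference from the final "ok ... changed: false" result further down.
    - name: Set up gateway ip on veth peer
      ansible.builtin.shell: |
        ip netns add ns1
        ip link set peerveth0 netns ns1
        ip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0
        ip netns exec ns1 ip link set peerveth0 up
      changed_when: false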
34052 1727204424.16613: running TaskExecutor() for managed-node1/TASK: Set up gateway ip on veth peer 34052 1727204424.16892: in run() - task 127b8e07-fff9-66a4-e2a3-00000000000d 34052 1727204424.16933: variable 'ansible_search_path' from source: unknown 34052 1727204424.16990: calling self._execute() 34052 1727204424.17108: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.17121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.17141: variable 'omit' from source: magic vars 34052 1727204424.17670: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.17695: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.17707: variable 'omit' from source: magic vars 34052 1727204424.17746: variable 'omit' from source: magic vars 34052 1727204424.17918: variable 'interface' from source: play vars 34052 1727204424.17947: variable 'omit' from source: magic vars 34052 1727204424.18003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204424.18060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204424.18091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204424.18133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204424.18151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204424.18194: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204424.18203: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.18210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.18343: Set connection var ansible_connection to ssh 34052 1727204424.18359: Set connection var ansible_timeout to 10 34052 1727204424.18375: Set connection var ansible_pipelining to False 34052 1727204424.18383: Set connection var ansible_shell_type to sh 34052 1727204424.18396: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204424.18410: Set connection var ansible_shell_executable to /bin/sh 34052 1727204424.18450: variable 'ansible_shell_executable' from source: unknown 34052 1727204424.18462: variable 'ansible_connection' from source: unknown 34052 1727204424.18472: variable 'ansible_module_compression' from source: unknown 34052 1727204424.18479: variable 'ansible_shell_type' from source: unknown 34052 1727204424.18486: variable 'ansible_shell_executable' from source: unknown 34052 1727204424.18493: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.18512: variable 'ansible_pipelining' from source: unknown 34052 1727204424.18524: variable 'ansible_timeout' from source: unknown 34052 1727204424.18536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.18740: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204424.18783: variable 'omit' from source: magic vars 34052 
1727204424.18789: starting attempt loop 34052 1727204424.18792: running the handler 34052 1727204424.18798: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204424.18872: _low_level_execute_command(): starting 34052 1727204424.18876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204424.19796: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204424.19802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204424.19833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.19933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204424.21914: stdout chunk (state=3): >>>/root <<< 34052 1727204424.21919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204424.21922: stdout chunk (state=3): >>><<< 34052 1727204424.21925: stderr chunk (state=3): >>><<< 34052 1727204424.22299: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204424.22304: _low_level_execute_command(): starting 34052 1727204424.22308: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850 `" && echo ansible-tmp-1727204424.2198937-34790-258955817576850="` echo /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850 `" ) && sleep 0' 34052 1727204424.23414: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204424.23441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204424.23458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204424.23485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204424.23497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204424.23500: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204424.23511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204424.23525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204424.23538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204424.23563: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204424.23604: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204424.23661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204424.23721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.23765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204424.25999: stdout chunk (state=3): >>>ansible-tmp-1727204424.2198937-34790-258955817576850=/root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850 <<< 34052 1727204424.26004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204424.26394: stderr chunk (state=3): >>><<< 34052 1727204424.26398: stdout chunk (state=3): >>><<< 34052 1727204424.26401: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204424.2198937-34790-258955817576850=/root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204424.26403: variable 'ansible_module_compression' from source: unknown 34052 1727204424.26405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204424.26438: variable 'ansible_facts' from source: unknown 34052 1727204424.26541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/AnsiballZ_command.py 34052 1727204424.26797: Sending initial data 34052 1727204424.26808: Sent initial data (156 bytes) 34052 1727204424.27380: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204424.27397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204424.27411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204424.27531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204424.27546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204424.27564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.27646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204424.29284: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204424.29355: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204424.29402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp6uncvz1j /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/AnsiballZ_command.py <<< 34052 1727204424.29428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/AnsiballZ_command.py" <<< 34052 1727204424.29474: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp6uncvz1j" to remote "/root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/AnsiballZ_command.py" <<< 34052 1727204424.30326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204424.30428: stderr chunk (state=3): >>><<< 34052 1727204424.30432: stdout chunk (state=3): >>><<< 34052 1727204424.30525: done transferring module to remote 34052 1727204424.30528: _low_level_execute_command(): starting 34052 1727204424.30531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/ /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/AnsiballZ_command.py && sleep 0' 34052 1727204424.31206: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204424.31227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204424.31303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204424.31382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.31476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204424.33601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204424.33619: stdout chunk (state=3): >>><<< 34052 1727204424.33632: stderr chunk (state=3): >>><<< 34052 1727204424.33653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204424.33661: _low_level_execute_command(): starting 34052 1727204424.33709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/AnsiballZ_command.py && sleep 0' 34052 1727204424.34393: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204424.34409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204424.34499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204424.34545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204424.34580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.34661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204424.53820: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-24 15:00:24.514410", "end": "2024-09-24 15:00:24.536182", "delta": "0:00:00.021772", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204424.55597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204424.56057: stderr chunk (state=3): >>><<< 34052 1727204424.56061: stdout chunk (state=3): >>><<< 34052 1727204424.56064: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-24 15:00:24.514410", "end": "2024-09-24 15:00:24.536182", "delta": "0:00:00.021772", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204424.56069: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204424.56071: _low_level_execute_command(): starting 34052 1727204424.56073: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204424.2198937-34790-258955817576850/ > /dev/null 2>&1 && sleep 0' 34052 1727204424.57294: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204424.57523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204424.57555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204424.57596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.57640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204424.60020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204424.60024: stdout chunk (state=3): >>><<< 34052 1727204424.60027: stderr chunk (state=3): >>><<< 34052 1727204424.60030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204424.60033: handler run complete 34052 1727204424.60035: Evaluated conditional (False): False 34052 1727204424.60037: attempt loop complete, returning result 34052 1727204424.60039: _execute() done 34052 1727204424.60041: dumping result to json 34052 1727204424.60043: done dumping result, returning 34052 1727204424.60045: done running TaskExecutor() for managed-node1/TASK: Set up gateway ip on veth peer [127b8e07-fff9-66a4-e2a3-00000000000d] 34052 1727204424.60047: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000d ok: [managed-node1] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.021772", "end": "2024-09-24 15:00:24.536182", "rc": 0, "start": "2024-09-24 15:00:24.514410" } 34052 1727204424.60303: no more pending results, returning what we have 34052 1727204424.60306: results queue empty 34052 1727204424.60307: checking for any_errors_fatal 34052 1727204424.60311: done checking for any_errors_fatal 34052 1727204424.60312: checking for max_fail_percentage 34052 1727204424.60314: done checking for max_fail_percentage 34052 1727204424.60314: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.60315: done checking to see if all hosts have failed 34052 1727204424.60316: getting the remaining hosts for this loop 34052 1727204424.60317: done getting the remaining hosts for this loop 34052 1727204424.60322: getting the next task for host managed-node1 34052 1727204424.60329: done getting next task for host managed-node1 34052 1727204424.60331: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 34052 1727204424.60333: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.60342: getting variables 34052 1727204424.60344: in VariableManager get_vars() 34052 1727204424.60540: Calling all_inventory to load vars for managed-node1 34052 1727204424.60543: Calling groups_inventory to load vars for managed-node1 34052 1727204424.60546: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.60679: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.60683: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.60686: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.61006: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000d 34052 1727204424.61011: WORKER PROCESS EXITING 34052 1727204424.61044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.61561: done with get_vars() 34052 1727204424.61600: done getting variables 34052 1727204424.61805: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.462) 0:00:10.935 ***** 34052 1727204424.61839: entering _queue_task() for managed-node1/debug 34052 1727204424.62520: worker is 1 (out of 1 available) 34052 1727204424.62535: exiting _queue_task() for managed-node1/debug 34052 1727204424.62548: done queuing things up, now waiting for results queue to drain 34052 1727204424.62550: waiting for pending results... 
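The next task is a debug banner that only announces the test case; the empty ok result and the row of '#' characters printed a little further down are its output. A sketch of what tests_ipv6.yml:27 might contain, with the name taken from the task header and the message inferred from the printed MSG:

    # Sketch only; the message text is inferred from the MSG line in the log.
    - name: "TEST: I can configure an interface with static ipv6 config"
      ansible.builtin.debug:
        msg: "##################################################"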
34052 1727204424.62974: running TaskExecutor() for managed-node1/TASK: TEST: I can configure an interface with static ipv6 config 34052 1727204424.63234: in run() - task 127b8e07-fff9-66a4-e2a3-00000000000f 34052 1727204424.63302: variable 'ansible_search_path' from source: unknown 34052 1727204424.63576: calling self._execute() 34052 1727204424.63580: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.63691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.63709: variable 'omit' from source: magic vars 34052 1727204424.64528: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.64684: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.64773: variable 'omit' from source: magic vars 34052 1727204424.64777: variable 'omit' from source: magic vars 34052 1727204424.64779: variable 'omit' from source: magic vars 34052 1727204424.64923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204424.64977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204424.65007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204424.65035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204424.65051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204424.65097: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204424.65109: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.65117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.65239: Set connection var ansible_connection to ssh 34052 1727204424.65252: Set connection var ansible_timeout to 10 34052 1727204424.65263: Set connection var ansible_pipelining to False 34052 1727204424.65272: Set connection var ansible_shell_type to sh 34052 1727204424.65284: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204424.65300: Set connection var ansible_shell_executable to /bin/sh 34052 1727204424.65342: variable 'ansible_shell_executable' from source: unknown 34052 1727204424.65351: variable 'ansible_connection' from source: unknown 34052 1727204424.65357: variable 'ansible_module_compression' from source: unknown 34052 1727204424.65364: variable 'ansible_shell_type' from source: unknown 34052 1727204424.65372: variable 'ansible_shell_executable' from source: unknown 34052 1727204424.65379: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.65386: variable 'ansible_pipelining' from source: unknown 34052 1727204424.65392: variable 'ansible_timeout' from source: unknown 34052 1727204424.65400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.65648: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204424.65652: variable 'omit' from source: magic vars 34052 1727204424.65655: starting attempt loop 34052 
1727204424.65657: running the handler 34052 1727204424.65659: handler run complete 34052 1727204424.65686: attempt loop complete, returning result 34052 1727204424.65693: _execute() done 34052 1727204424.65700: dumping result to json 34052 1727204424.65707: done dumping result, returning 34052 1727204424.65718: done running TaskExecutor() for managed-node1/TASK: TEST: I can configure an interface with static ipv6 config [127b8e07-fff9-66a4-e2a3-00000000000f] 34052 1727204424.65726: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000f ok: [managed-node1] => {} MSG: ################################################## 34052 1727204424.65915: no more pending results, returning what we have 34052 1727204424.65918: results queue empty 34052 1727204424.65919: checking for any_errors_fatal 34052 1727204424.65927: done checking for any_errors_fatal 34052 1727204424.65928: checking for max_fail_percentage 34052 1727204424.65930: done checking for max_fail_percentage 34052 1727204424.65930: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.65931: done checking to see if all hosts have failed 34052 1727204424.65932: getting the remaining hosts for this loop 34052 1727204424.65934: done getting the remaining hosts for this loop 34052 1727204424.65939: getting the next task for host managed-node1 34052 1727204424.65947: done getting next task for host managed-node1 34052 1727204424.65953: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34052 1727204424.65956: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.65977: getting variables 34052 1727204424.65979: in VariableManager get_vars() 34052 1727204424.66028: Calling all_inventory to load vars for managed-node1 34052 1727204424.66031: Calling groups_inventory to load vars for managed-node1 34052 1727204424.66034: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.66046: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.66049: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.66052: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.66559: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000000f 34052 1727204424.66564: WORKER PROCESS EXITING 34052 1727204424.66595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.67178: done with get_vars() 34052 1727204424.67191: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.054) 0:00:10.990 ***** 34052 1727204424.67306: entering _queue_task() for managed-node1/include_tasks 34052 1727204424.67979: worker is 1 (out of 1 available) 34052 1727204424.68187: exiting _queue_task() for managed-node1/include_tasks 34052 1727204424.68203: done queuing things up, now waiting for results queue to drain 34052 1727204424.68269: waiting for pending results... 34052 1727204424.68661: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34052 1727204424.69222: in run() - task 127b8e07-fff9-66a4-e2a3-000000000017 34052 1727204424.69227: variable 'ansible_search_path' from source: unknown 34052 1727204424.69230: variable 'ansible_search_path' from source: unknown 34052 1727204424.69233: calling self._execute() 34052 1727204424.69466: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.69482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.69499: variable 'omit' from source: magic vars 34052 1727204424.69951: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.69973: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.69991: _execute() done 34052 1727204424.69998: dumping result to json 34052 1727204424.70004: done dumping result, returning 34052 1727204424.70072: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-66a4-e2a3-000000000017] 34052 1727204424.70075: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000017 34052 1727204424.70372: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000017 34052 1727204424.70377: WORKER PROCESS EXITING 34052 1727204424.70418: no more pending results, returning what we have 34052 1727204424.70422: in VariableManager get_vars() 34052 1727204424.70470: Calling all_inventory to load vars for managed-node1 34052 1727204424.70473: Calling groups_inventory to load vars for managed-node1 34052 1727204424.70476: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.70486: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.70489: Calling groups_plugins_inventory to load vars for managed-node1 34052 
1727204424.70493: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.70792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.71054: done with get_vars() 34052 1727204424.71067: variable 'ansible_search_path' from source: unknown 34052 1727204424.71068: variable 'ansible_search_path' from source: unknown 34052 1727204424.71124: we have included files to process 34052 1727204424.71126: generating all_blocks data 34052 1727204424.71128: done generating all_blocks data 34052 1727204424.71132: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34052 1727204424.71137: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34052 1727204424.71140: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34052 1727204424.72193: done processing included file 34052 1727204424.72196: iterating over new_blocks loaded from include file 34052 1727204424.72197: in VariableManager get_vars() 34052 1727204424.72230: done with get_vars() 34052 1727204424.72232: filtering new block on tags 34052 1727204424.72251: done filtering new block on tags 34052 1727204424.72255: in VariableManager get_vars() 34052 1727204424.72331: done with get_vars() 34052 1727204424.72333: filtering new block on tags 34052 1727204424.72357: done filtering new block on tags 34052 1727204424.72360: in VariableManager get_vars() 34052 1727204424.72387: done with get_vars() 34052 1727204424.72390: filtering new block on tags 34052 1727204424.72410: done filtering new block on tags 34052 1727204424.72412: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 34052 1727204424.72418: extending task lists for all hosts with included blocks 34052 1727204424.73487: done extending task lists 34052 1727204424.73489: done processing included files 34052 1727204424.73490: results queue empty 34052 1727204424.73491: checking for any_errors_fatal 34052 1727204424.73496: done checking for any_errors_fatal 34052 1727204424.73497: checking for max_fail_percentage 34052 1727204424.73498: done checking for max_fail_percentage 34052 1727204424.73499: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.73499: done checking to see if all hosts have failed 34052 1727204424.73500: getting the remaining hosts for this loop 34052 1727204424.73502: done getting the remaining hosts for this loop 34052 1727204424.73504: getting the next task for host managed-node1 34052 1727204424.73514: done getting next task for host managed-node1 34052 1727204424.73517: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34052 1727204424.73521: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204424.73532: getting variables 34052 1727204424.73534: in VariableManager get_vars() 34052 1727204424.73555: Calling all_inventory to load vars for managed-node1 34052 1727204424.73558: Calling groups_inventory to load vars for managed-node1 34052 1727204424.73560: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.73570: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.73573: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.73576: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.73792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.74081: done with get_vars() 34052 1727204424.74099: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.068) 0:00:11.059 ***** 34052 1727204424.74208: entering _queue_task() for managed-node1/setup 34052 1727204424.74709: worker is 1 (out of 1 available) 34052 1727204424.74720: exiting _queue_task() for managed-node1/setup 34052 1727204424.74732: done queuing things up, now waiting for results queue to drain 34052 1727204424.74733: waiting for pending results... 
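The task queued here, "Ensure ansible_facts used by role are present" (set_facts.yml:3), ends up skipped a few records below because its conditional, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluates to False. A minimal sketch of the same set-difference check in plain Python follows; the fact names and values are hypothetical placeholders, not taken from this run.

# Gather facts only if any required fact is missing from what is already known.
required_facts = {"distribution", "distribution_major_version", "os_family"}  # placeholders
gathered_facts = {
    "distribution": "Fedora",              # placeholder value
    "distribution_major_version": "40",    # placeholder value
    "os_family": "RedHat",                 # placeholder value
}

missing = required_facts.difference(gathered_facts.keys())
run_setup = len(missing) > 0  # mirrors: ... | difference(...) | length > 0
print(run_setup)              # False -> the task is skipped, as in the log
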
34052 1727204424.74939: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34052 1727204424.75080: in run() - task 127b8e07-fff9-66a4-e2a3-0000000001fc 34052 1727204424.75101: variable 'ansible_search_path' from source: unknown 34052 1727204424.75143: variable 'ansible_search_path' from source: unknown 34052 1727204424.75160: calling self._execute() 34052 1727204424.75264: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.75283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.75297: variable 'omit' from source: magic vars 34052 1727204424.75772: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.75791: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.76200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204424.79798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204424.79914: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204424.80049: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204424.80053: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204424.80070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204424.80177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204424.80214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204424.80250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204424.80307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204424.80329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204424.80403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204424.80485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204424.80489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204424.80518: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204424.80539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204424.80737: variable '__network_required_facts' from source: role '' defaults 34052 1727204424.80751: variable 'ansible_facts' from source: unknown 34052 1727204424.80874: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 34052 1727204424.80882: when evaluation is False, skipping this task 34052 1727204424.80889: _execute() done 34052 1727204424.80919: dumping result to json 34052 1727204424.80922: done dumping result, returning 34052 1727204424.80927: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-66a4-e2a3-0000000001fc] 34052 1727204424.80929: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001fc skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204424.81203: no more pending results, returning what we have 34052 1727204424.81208: results queue empty 34052 1727204424.81209: checking for any_errors_fatal 34052 1727204424.81210: done checking for any_errors_fatal 34052 1727204424.81211: checking for max_fail_percentage 34052 1727204424.81213: done checking for max_fail_percentage 34052 1727204424.81214: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.81215: done checking to see if all hosts have failed 34052 1727204424.81216: getting the remaining hosts for this loop 34052 1727204424.81218: done getting the remaining hosts for this loop 34052 1727204424.81223: getting the next task for host managed-node1 34052 1727204424.81234: done getting next task for host managed-node1 34052 1727204424.81245: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 34052 1727204424.81250: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.81263: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001fc 34052 1727204424.81267: WORKER PROCESS EXITING 34052 1727204424.81278: getting variables 34052 1727204424.81280: in VariableManager get_vars() 34052 1727204424.81331: Calling all_inventory to load vars for managed-node1 34052 1727204424.81334: Calling groups_inventory to load vars for managed-node1 34052 1727204424.81337: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.81469: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.81474: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.81479: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.81977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.82418: done with get_vars() 34052 1727204424.82442: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.083) 0:00:11.143 ***** 34052 1727204424.82594: entering _queue_task() for managed-node1/stat 34052 1727204424.83074: worker is 1 (out of 1 available) 34052 1727204424.83093: exiting _queue_task() for managed-node1/stat 34052 1727204424.83106: done queuing things up, now waiting for results queue to drain 34052 1727204424.83107: waiting for pending results... 34052 1727204424.83329: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 34052 1727204424.83573: in run() - task 127b8e07-fff9-66a4-e2a3-0000000001fe 34052 1727204424.83600: variable 'ansible_search_path' from source: unknown 34052 1727204424.83604: variable 'ansible_search_path' from source: unknown 34052 1727204424.83763: calling self._execute() 34052 1727204424.83811: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.83822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.83839: variable 'omit' from source: magic vars 34052 1727204424.84284: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.84307: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.84518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204424.84939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204424.85045: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204424.85050: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204424.85096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204424.85207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204424.85246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204424.85288: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204424.85371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204424.85421: variable '__network_is_ostree' from source: set_fact 34052 1727204424.85436: Evaluated conditional (not __network_is_ostree is defined): False 34052 1727204424.85442: when evaluation is False, skipping this task 34052 1727204424.85448: _execute() done 34052 1727204424.85454: dumping result to json 34052 1727204424.85459: done dumping result, returning 34052 1727204424.85474: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-66a4-e2a3-0000000001fe] 34052 1727204424.85486: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001fe 34052 1727204424.85724: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001fe 34052 1727204424.85731: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34052 1727204424.85797: no more pending results, returning what we have 34052 1727204424.85807: results queue empty 34052 1727204424.85808: checking for any_errors_fatal 34052 1727204424.85815: done checking for any_errors_fatal 34052 1727204424.85816: checking for max_fail_percentage 34052 1727204424.85818: done checking for max_fail_percentage 34052 1727204424.85819: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.85820: done checking to see if all hosts have failed 34052 1727204424.85821: getting the remaining hosts for this loop 34052 1727204424.85823: done getting the remaining hosts for this loop 34052 1727204424.85832: getting the next task for host managed-node1 34052 1727204424.85839: done getting next task for host managed-node1 34052 1727204424.85844: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34052 1727204424.85849: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.85869: getting variables 34052 1727204424.85872: in VariableManager get_vars() 34052 1727204424.86093: Calling all_inventory to load vars for managed-node1 34052 1727204424.86097: Calling groups_inventory to load vars for managed-node1 34052 1727204424.86099: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.86111: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.86114: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.86117: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.86534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.86852: done with get_vars() 34052 1727204424.86868: done getting variables 34052 1727204424.86933: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.044) 0:00:11.187 ***** 34052 1727204424.87077: entering _queue_task() for managed-node1/set_fact 34052 1727204424.87460: worker is 1 (out of 1 available) 34052 1727204424.87606: exiting _queue_task() for managed-node1/set_fact 34052 1727204424.87619: done queuing things up, now waiting for results queue to drain 34052 1727204424.87620: waiting for pending results... 34052 1727204424.87942: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34052 1727204424.88019: in run() - task 127b8e07-fff9-66a4-e2a3-0000000001ff 34052 1727204424.88050: variable 'ansible_search_path' from source: unknown 34052 1727204424.88058: variable 'ansible_search_path' from source: unknown 34052 1727204424.88107: calling self._execute() 34052 1727204424.88220: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.88238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.88283: variable 'omit' from source: magic vars 34052 1727204424.88706: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.88802: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.88949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204424.89282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204424.89354: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204424.89414: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204424.89464: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204424.89601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204424.89691: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204424.89738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204424.89871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204424.89931: variable '__network_is_ostree' from source: set_fact 34052 1727204424.89945: Evaluated conditional (not __network_is_ostree is defined): False 34052 1727204424.89952: when evaluation is False, skipping this task 34052 1727204424.89959: _execute() done 34052 1727204424.89967: dumping result to json 34052 1727204424.89975: done dumping result, returning 34052 1727204424.89992: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-66a4-e2a3-0000000001ff] 34052 1727204424.90006: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001ff skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34052 1727204424.90274: no more pending results, returning what we have 34052 1727204424.90278: results queue empty 34052 1727204424.90279: checking for any_errors_fatal 34052 1727204424.90287: done checking for any_errors_fatal 34052 1727204424.90288: checking for max_fail_percentage 34052 1727204424.90289: done checking for max_fail_percentage 34052 1727204424.90290: checking to see if all hosts have failed and the running result is not ok 34052 1727204424.90291: done checking to see if all hosts have failed 34052 1727204424.90292: getting the remaining hosts for this loop 34052 1727204424.90294: done getting the remaining hosts for this loop 34052 1727204424.90298: getting the next task for host managed-node1 34052 1727204424.90309: done getting next task for host managed-node1 34052 1727204424.90313: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 34052 1727204424.90318: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204424.90480: getting variables 34052 1727204424.90482: in VariableManager get_vars() 34052 1727204424.90527: Calling all_inventory to load vars for managed-node1 34052 1727204424.90530: Calling groups_inventory to load vars for managed-node1 34052 1727204424.90533: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204424.90542: Calling all_plugins_play to load vars for managed-node1 34052 1727204424.90545: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204424.90548: Calling groups_plugins_play to load vars for managed-node1 34052 1727204424.90850: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000001ff 34052 1727204424.90854: WORKER PROCESS EXITING 34052 1727204424.90883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204424.91162: done with get_vars() 34052 1727204424.91178: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:00:24 -0400 (0:00:00.042) 0:00:11.230 ***** 34052 1727204424.91296: entering _queue_task() for managed-node1/service_facts 34052 1727204424.91298: Creating lock for service_facts 34052 1727204424.91793: worker is 1 (out of 1 available) 34052 1727204424.91805: exiting _queue_task() for managed-node1/service_facts 34052 1727204424.91817: done queuing things up, now waiting for results queue to drain 34052 1727204424.91819: waiting for pending results... 34052 1727204424.92049: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 34052 1727204424.92199: in run() - task 127b8e07-fff9-66a4-e2a3-000000000201 34052 1727204424.92228: variable 'ansible_search_path' from source: unknown 34052 1727204424.92238: variable 'ansible_search_path' from source: unknown 34052 1727204424.92295: calling self._execute() 34052 1727204424.92427: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.92443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.92459: variable 'omit' from source: magic vars 34052 1727204424.93152: variable 'ansible_distribution_major_version' from source: facts 34052 1727204424.93176: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204424.93196: variable 'omit' from source: magic vars 34052 1727204424.93300: variable 'omit' from source: magic vars 34052 1727204424.93352: variable 'omit' from source: magic vars 34052 1727204424.93424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204424.93520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204424.93524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204424.93547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204424.93571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204424.93829: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204424.93832: variable 'ansible_host' from source: host vars for 
'managed-node1' 34052 1727204424.93835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.93874: Set connection var ansible_connection to ssh 34052 1727204424.93890: Set connection var ansible_timeout to 10 34052 1727204424.93902: Set connection var ansible_pipelining to False 34052 1727204424.93909: Set connection var ansible_shell_type to sh 34052 1727204424.93922: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204424.93943: Set connection var ansible_shell_executable to /bin/sh 34052 1727204424.93986: variable 'ansible_shell_executable' from source: unknown 34052 1727204424.93994: variable 'ansible_connection' from source: unknown 34052 1727204424.94002: variable 'ansible_module_compression' from source: unknown 34052 1727204424.94011: variable 'ansible_shell_type' from source: unknown 34052 1727204424.94018: variable 'ansible_shell_executable' from source: unknown 34052 1727204424.94029: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204424.94042: variable 'ansible_pipelining' from source: unknown 34052 1727204424.94049: variable 'ansible_timeout' from source: unknown 34052 1727204424.94062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204424.94293: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204424.94311: variable 'omit' from source: magic vars 34052 1727204424.94320: starting attempt loop 34052 1727204424.94369: running the handler 34052 1727204424.94372: _low_level_execute_command(): starting 34052 1727204424.94377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204424.95262: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204424.95357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204424.95386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.95464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204424.97233: stdout chunk (state=3): >>>/root <<< 34052 1727204424.97451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204424.97455: stdout chunk (state=3): >>><<< 34052 1727204424.97458: stderr chunk (state=3): >>><<< 34052 1727204424.97482: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204424.97504: _low_level_execute_command(): starting 34052 1727204424.97595: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848 `" && echo ansible-tmp-1727204424.974893-34822-93522011081848="` echo /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848 `" ) && sleep 0' 34052 1727204424.98218: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204424.98238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204424.98254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204424.98278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204424.98333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204424.98415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204424.98435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204424.98459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204424.98573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204425.00682: stdout chunk (state=3): >>>ansible-tmp-1727204424.974893-34822-93522011081848=/root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848 <<< 34052 1727204425.01073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204425.01077: 
stdout chunk (state=3): >>><<< 34052 1727204425.01080: stderr chunk (state=3): >>><<< 34052 1727204425.01083: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204424.974893-34822-93522011081848=/root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204425.01085: variable 'ansible_module_compression' from source: unknown 34052 1727204425.01088: ANSIBALLZ: Using lock for service_facts 34052 1727204425.01090: ANSIBALLZ: Acquiring lock 34052 1727204425.01092: ANSIBALLZ: Lock acquired: 140141525728048 34052 1727204425.01094: ANSIBALLZ: Creating module 34052 1727204425.22743: ANSIBALLZ: Writing module into payload 34052 1727204425.22972: ANSIBALLZ: Writing module 34052 1727204425.23375: ANSIBALLZ: Renaming module 34052 1727204425.23379: ANSIBALLZ: Done creating module 34052 1727204425.23382: variable 'ansible_facts' from source: unknown 34052 1727204425.23384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/AnsiballZ_service_facts.py 34052 1727204425.23686: Sending initial data 34052 1727204425.23691: Sent initial data (160 bytes) 34052 1727204425.24929: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204425.25032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204425.25075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204425.25094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 
1727204425.25119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204425.25251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204425.27013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204425.27110: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204425.27163: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpie_glp4o /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/AnsiballZ_service_facts.py <<< 34052 1727204425.27271: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/AnsiballZ_service_facts.py" <<< 34052 1727204425.27308: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpie_glp4o" to remote "/root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/AnsiballZ_service_facts.py" <<< 34052 1727204425.28821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204425.28829: stderr chunk (state=3): >>><<< 34052 1727204425.28832: stdout chunk (state=3): >>><<< 34052 1727204425.28834: done transferring module to remote 34052 1727204425.28837: _low_level_execute_command(): starting 34052 1727204425.28839: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/ /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/AnsiballZ_service_facts.py && sleep 0' 34052 1727204425.30237: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204425.30286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204425.30329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204425.30474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204425.30528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204425.32729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204425.32835: stderr chunk (state=3): >>><<< 34052 1727204425.32839: stdout chunk (state=3): >>><<< 34052 1727204425.32841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204425.32843: _low_level_execute_command(): starting 34052 1727204425.32846: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/AnsiballZ_service_facts.py && sleep 0' 34052 1727204425.34453: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204425.34561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204425.34590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204425.34734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204427.68700: stdout chunk (state=3): >>> 
{"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "sourc<<< 34052 1727204427.68752: stdout chunk (state=3): >>>e": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plym<<< 34052 1727204427.68784: stdout chunk (state=3): >>>outh-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 34052 1727204427.70554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204427.70582: stdout chunk (state=3): >>><<< 34052 1727204427.70596: stderr chunk (state=3): >>><<< 34052 1727204427.70627: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": 
"systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204427.71574: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204427.71578: _low_level_execute_command(): starting 34052 1727204427.71581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204424.974893-34822-93522011081848/ > /dev/null 2>&1 && sleep 0' 34052 1727204427.72279: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204427.72295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204427.72391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204427.72429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204427.72449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204427.72476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204427.72561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204427.74669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204427.74674: stdout chunk (state=3): >>><<< 34052 1727204427.74676: stderr chunk (state=3): >>><<< 34052 1727204427.74771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204427.74775: handler run complete 34052 1727204427.75173: variable 'ansible_facts' from source: unknown 34052 1727204427.77344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204427.78401: variable 'ansible_facts' from source: unknown 34052 1727204427.78472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204427.78769: attempt loop complete, returning result 34052 1727204427.78782: _execute() done 34052 1727204427.78789: dumping result to json 34052 1727204427.78882: done dumping result, returning 34052 1727204427.78898: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-66a4-e2a3-000000000201] 34052 1727204427.78907: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000201 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204427.80398: no more pending results, returning what we have 34052 1727204427.80400: results queue empty 34052 1727204427.80401: checking for any_errors_fatal 34052 1727204427.80405: done checking for any_errors_fatal 34052 1727204427.80406: checking for max_fail_percentage 34052 1727204427.80408: done checking for max_fail_percentage 34052 1727204427.80409: checking to see if all hosts have failed and the running result is not ok 34052 1727204427.80410: done checking to see if all hosts have failed 34052 1727204427.80410: getting the remaining hosts for this loop 34052 1727204427.80412: done getting the remaining hosts for this loop 34052 1727204427.80415: getting the next task for host managed-node1 34052 1727204427.80421: done getting next task for host managed-node1 34052 1727204427.80425: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 34052 1727204427.80430: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204427.80446: getting variables 34052 1727204427.80448: in VariableManager get_vars() 34052 1727204427.80489: Calling all_inventory to load vars for managed-node1 34052 1727204427.80492: Calling groups_inventory to load vars for managed-node1 34052 1727204427.80497: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204427.80506: Calling all_plugins_play to load vars for managed-node1 34052 1727204427.80509: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204427.80513: Calling groups_plugins_play to load vars for managed-node1 34052 1727204427.81028: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000201 34052 1727204427.81032: WORKER PROCESS EXITING 34052 1727204427.81058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204427.81713: done with get_vars() 34052 1727204427.81731: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:00:27 -0400 (0:00:02.905) 0:00:14.135 ***** 34052 1727204427.81833: entering _queue_task() for managed-node1/package_facts 34052 1727204427.81839: Creating lock for package_facts 34052 1727204427.82322: worker is 1 (out of 1 available) 34052 1727204427.82334: exiting _queue_task() for managed-node1/package_facts 34052 1727204427.82347: done queuing things up, now waiting for results queue to drain 34052 1727204427.82348: waiting for pending results... 34052 1727204427.82556: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 34052 1727204427.82737: in run() - task 127b8e07-fff9-66a4-e2a3-000000000202 34052 1727204427.82760: variable 'ansible_search_path' from source: unknown 34052 1727204427.82772: variable 'ansible_search_path' from source: unknown 34052 1727204427.82822: calling self._execute() 34052 1727204427.82945: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204427.82948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204427.82957: variable 'omit' from source: magic vars 34052 1727204427.83493: variable 'ansible_distribution_major_version' from source: facts 34052 1727204427.83497: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204427.83499: variable 'omit' from source: magic vars 34052 1727204427.83501: variable 'omit' from source: magic vars 34052 1727204427.83542: variable 'omit' from source: magic vars 34052 1727204427.83588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204427.83638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204427.83662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204427.83687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204427.83709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204427.83751: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204427.83760: variable 'ansible_host' from source: host vars for 
'managed-node1' 34052 1727204427.83771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204427.83892: Set connection var ansible_connection to ssh 34052 1727204427.83904: Set connection var ansible_timeout to 10 34052 1727204427.83926: Set connection var ansible_pipelining to False 34052 1727204427.83928: Set connection var ansible_shell_type to sh 34052 1727204427.83933: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204427.83972: Set connection var ansible_shell_executable to /bin/sh 34052 1727204427.83980: variable 'ansible_shell_executable' from source: unknown 34052 1727204427.83988: variable 'ansible_connection' from source: unknown 34052 1727204427.83994: variable 'ansible_module_compression' from source: unknown 34052 1727204427.83999: variable 'ansible_shell_type' from source: unknown 34052 1727204427.84004: variable 'ansible_shell_executable' from source: unknown 34052 1727204427.84011: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204427.84017: variable 'ansible_pipelining' from source: unknown 34052 1727204427.84023: variable 'ansible_timeout' from source: unknown 34052 1727204427.84033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204427.84301: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204427.84306: variable 'omit' from source: magic vars 34052 1727204427.84309: starting attempt loop 34052 1727204427.84311: running the handler 34052 1727204427.84331: _low_level_execute_command(): starting 34052 1727204427.84345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204427.85238: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204427.85260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204427.85333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204427.85359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204427.85419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204427.85483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204427.87268: stdout chunk (state=3): >>>/root <<< 34052 1727204427.87387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204427.87497: stderr chunk (state=3): 
>>><<< 34052 1727204427.87501: stdout chunk (state=3): >>><<< 34052 1727204427.87519: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204427.87573: _low_level_execute_command(): starting 34052 1727204427.87577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141 `" && echo ansible-tmp-1727204427.875257-34990-214625709327141="` echo /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141 `" ) && sleep 0' 34052 1727204427.88256: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204427.88376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204427.88421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204427.88512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204427.90826: stdout chunk (state=3): >>>ansible-tmp-1727204427.875257-34990-214625709327141=/root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141 <<< 34052 1727204427.90977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204427.90981: stdout chunk (state=3): >>><<< 34052 1727204427.90983: stderr chunk (state=3): >>><<< 34052 1727204427.90986: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204427.875257-34990-214625709327141=/root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204427.91064: variable 'ansible_module_compression' from source: unknown 34052 1727204427.91147: ANSIBALLZ: Using lock for package_facts 34052 1727204427.91365: ANSIBALLZ: Acquiring lock 34052 1727204427.91370: ANSIBALLZ: Lock acquired: 140141525727616 34052 1727204427.91372: ANSIBALLZ: Creating module 34052 1727204428.33053: ANSIBALLZ: Writing module into payload 34052 1727204428.33447: ANSIBALLZ: Writing module 34052 1727204428.33486: ANSIBALLZ: Renaming module 34052 1727204428.33490: ANSIBALLZ: Done creating module 34052 1727204428.33537: variable 'ansible_facts' from source: unknown 34052 1727204428.33789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/AnsiballZ_package_facts.py 34052 1727204428.34278: Sending initial data 34052 1727204428.34285: Sent initial data (161 bytes) 34052 1727204428.35254: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204428.35258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204428.35261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204428.35274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204428.35333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 34052 1727204428.37043: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204428.37217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204428.37273: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp0oro20y8 /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/AnsiballZ_package_facts.py <<< 34052 1727204428.37277: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/AnsiballZ_package_facts.py" <<< 34052 1727204428.37312: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp0oro20y8" to remote "/root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/AnsiballZ_package_facts.py" <<< 34052 1727204428.40378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204428.40406: stderr chunk (state=3): >>><<< 34052 1727204428.40410: stdout chunk (state=3): >>><<< 34052 1727204428.40441: done transferring module to remote 34052 1727204428.40454: _low_level_execute_command(): starting 34052 1727204428.40460: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/ /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/AnsiballZ_package_facts.py && sleep 0' 34052 1727204428.41719: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204428.41777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204428.41989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204428.42010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' 
<<< 34052 1727204428.42030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204428.42047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204428.42135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204428.44107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204428.44332: stderr chunk (state=3): >>><<< 34052 1727204428.44344: stdout chunk (state=3): >>><<< 34052 1727204428.44398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204428.44408: _low_level_execute_command(): starting 34052 1727204428.44572: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/AnsiballZ_package_facts.py && sleep 0' 34052 1727204428.46019: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204428.46024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204428.46027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204428.46101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204429.11022: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": 
"8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 34052 1727204429.11135: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", 
"version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": 
"4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": 
"3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", 
"release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": 
"libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": 
"rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", 
"version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": <<< 34052 1727204429.11273: stdout chunk (state=3): >>>"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", 
"version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": 
[{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": 
[{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 34052 1727204429.11282: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": 
"2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 34052 1727204429.13671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204429.13676: stdout chunk (state=3): >>><<< 34052 1727204429.13710: stderr chunk (state=3): >>><<< 34052 1727204429.13762: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204429.20475: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204429.20480: _low_level_execute_command(): starting 34052 1727204429.20483: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204427.875257-34990-214625709327141/ > /dev/null 2>&1 && sleep 0' 34052 1727204429.21059: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204429.21252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204429.21260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204429.21264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204429.21268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204429.23287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204429.23369: stderr chunk (state=3): >>><<< 34052 1727204429.23388: stdout chunk (state=3): >>><<< 34052 1727204429.23409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204429.23421: handler run complete 34052 1727204429.24671: variable 'ansible_facts' from source: unknown 34052 1727204429.25816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204429.29038: variable 'ansible_facts' from source: unknown 34052 1727204429.44033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204429.45881: attempt loop complete, returning result 34052 1727204429.45918: _execute() done 34052 1727204429.45930: dumping result to json 34052 1727204429.46257: done dumping result, returning 34052 1727204429.46277: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-66a4-e2a3-000000000202] 34052 1727204429.46286: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000202 34052 1727204429.49914: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000202 34052 1727204429.49918: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204429.50012: no more pending results, returning what we have 34052 1727204429.50014: results queue empty 34052 1727204429.50015: checking for any_errors_fatal 34052 1727204429.50019: done checking for any_errors_fatal 34052 1727204429.50020: checking for max_fail_percentage 34052 1727204429.50021: done checking for max_fail_percentage 34052 1727204429.50022: checking to see if all hosts have failed and the running result is not ok 34052 1727204429.50023: done checking to see if all hosts have failed 34052 1727204429.50024: getting the remaining hosts for this loop 34052 1727204429.50027: done getting the remaining hosts for this loop 34052 1727204429.50031: getting the next task for host managed-node1 34052 1727204429.50038: done getting next task for host managed-node1 34052 1727204429.50043: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34052 1727204429.50046: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204429.50063: getting variables 34052 1727204429.50069: in VariableManager get_vars() 34052 1727204429.50106: Calling all_inventory to load vars for managed-node1 34052 1727204429.50109: Calling groups_inventory to load vars for managed-node1 34052 1727204429.50112: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204429.50121: Calling all_plugins_play to load vars for managed-node1 34052 1727204429.50124: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204429.50130: Calling groups_plugins_play to load vars for managed-node1 34052 1727204429.51853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204429.54027: done with get_vars() 34052 1727204429.54048: done getting variables 34052 1727204429.54102: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:00:29 -0400 (0:00:01.722) 0:00:15.858 ***** 34052 1727204429.54132: entering _queue_task() for managed-node1/debug 34052 1727204429.54410: worker is 1 (out of 1 available) 34052 1727204429.54429: exiting _queue_task() for managed-node1/debug 34052 1727204429.54441: done queuing things up, now waiting for results queue to drain 34052 1727204429.54443: waiting for pending results... 
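The package_facts result a few lines up is shown as "censored" because the task ran with no_log enabled; its invocation block records module_args of manager: ['auto'] and strategy: 'first'. Note that the raw module stdout (the large JSON package dump) still appears in the low-level debug output above, while the task result presented to the user is replaced by the censored message. As a point of reference only, a minimal sketch of an equivalent standalone task (not the role's verbatim source) that would gather the same data into ansible_facts.packages:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # let the module pick the package manager (rpm here) automatically
        strategy: first    # stop at the first manager that returns results
      no_log: true         # censor the large result in the displayed task output, as seen above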
34052 1727204429.54631: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 34052 1727204429.54727: in run() - task 127b8e07-fff9-66a4-e2a3-000000000018 34052 1727204429.54737: variable 'ansible_search_path' from source: unknown 34052 1727204429.54741: variable 'ansible_search_path' from source: unknown 34052 1727204429.54775: calling self._execute() 34052 1727204429.54852: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204429.54858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204429.54868: variable 'omit' from source: magic vars 34052 1727204429.55261: variable 'ansible_distribution_major_version' from source: facts 34052 1727204429.55273: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204429.55280: variable 'omit' from source: magic vars 34052 1727204429.55323: variable 'omit' from source: magic vars 34052 1727204429.55404: variable 'network_provider' from source: set_fact 34052 1727204429.55417: variable 'omit' from source: magic vars 34052 1727204429.55456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204429.55490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204429.55508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204429.55523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204429.55533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204429.55562: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204429.55568: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204429.55571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204429.55643: Set connection var ansible_connection to ssh 34052 1727204429.55652: Set connection var ansible_timeout to 10 34052 1727204429.55655: Set connection var ansible_pipelining to False 34052 1727204429.55658: Set connection var ansible_shell_type to sh 34052 1727204429.55668: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204429.55676: Set connection var ansible_shell_executable to /bin/sh 34052 1727204429.55701: variable 'ansible_shell_executable' from source: unknown 34052 1727204429.55705: variable 'ansible_connection' from source: unknown 34052 1727204429.55708: variable 'ansible_module_compression' from source: unknown 34052 1727204429.55711: variable 'ansible_shell_type' from source: unknown 34052 1727204429.55713: variable 'ansible_shell_executable' from source: unknown 34052 1727204429.55715: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204429.55718: variable 'ansible_pipelining' from source: unknown 34052 1727204429.55720: variable 'ansible_timeout' from source: unknown 34052 1727204429.55724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204429.55842: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 34052 1727204429.55852: variable 'omit' from source: magic vars 34052 1727204429.55855: starting attempt loop 34052 1727204429.55859: running the handler 34052 1727204429.55906: handler run complete 34052 1727204429.55916: attempt loop complete, returning result 34052 1727204429.55919: _execute() done 34052 1727204429.55922: dumping result to json 34052 1727204429.55927: done dumping result, returning 34052 1727204429.55931: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-66a4-e2a3-000000000018] 34052 1727204429.55937: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000018 34052 1727204429.56072: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000018 34052 1727204429.56077: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 34052 1727204429.56150: no more pending results, returning what we have 34052 1727204429.56153: results queue empty 34052 1727204429.56154: checking for any_errors_fatal 34052 1727204429.56164: done checking for any_errors_fatal 34052 1727204429.56205: checking for max_fail_percentage 34052 1727204429.56208: done checking for max_fail_percentage 34052 1727204429.56209: checking to see if all hosts have failed and the running result is not ok 34052 1727204429.56212: done checking to see if all hosts have failed 34052 1727204429.56213: getting the remaining hosts for this loop 34052 1727204429.56224: done getting the remaining hosts for this loop 34052 1727204429.56235: getting the next task for host managed-node1 34052 1727204429.56327: done getting next task for host managed-node1 34052 1727204429.56332: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34052 1727204429.56335: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204429.56353: getting variables 34052 1727204429.56355: in VariableManager get_vars() 34052 1727204429.56461: Calling all_inventory to load vars for managed-node1 34052 1727204429.56463: Calling groups_inventory to load vars for managed-node1 34052 1727204429.56468: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204429.56477: Calling all_plugins_play to load vars for managed-node1 34052 1727204429.56480: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204429.56483: Calling groups_plugins_play to load vars for managed-node1 34052 1727204429.58518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204429.60089: done with get_vars() 34052 1727204429.60130: done getting variables 34052 1727204429.60201: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.061) 0:00:15.919 ***** 34052 1727204429.60241: entering _queue_task() for managed-node1/fail 34052 1727204429.60602: worker is 1 (out of 1 available) 34052 1727204429.60617: exiting _queue_task() for managed-node1/fail 34052 1727204429.60635: done queuing things up, now waiting for results queue to drain 34052 1727204429.60637: waiting for pending results... 
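The "Using network provider: nm" message above comes from a debug action whose msg interpolates the network_provider value set earlier (the log notes it was loaded "from source: set_fact"). Because debug runs entirely on the controller, no SSH traffic appears for this task, unlike the package_facts call before it. A minimal sketch of such a task (illustrative, not the role's verbatim source):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"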
34052 1727204429.61197: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34052 1727204429.61203: in run() - task 127b8e07-fff9-66a4-e2a3-000000000019 34052 1727204429.61205: variable 'ansible_search_path' from source: unknown 34052 1727204429.61208: variable 'ansible_search_path' from source: unknown 34052 1727204429.61211: calling self._execute() 34052 1727204429.61309: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204429.61322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204429.61342: variable 'omit' from source: magic vars 34052 1727204429.62433: variable 'ansible_distribution_major_version' from source: facts 34052 1727204429.62438: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204429.62619: variable 'network_state' from source: role '' defaults 34052 1727204429.62667: Evaluated conditional (network_state != {}): False 34052 1727204429.62715: when evaluation is False, skipping this task 34052 1727204429.62728: _execute() done 34052 1727204429.62737: dumping result to json 34052 1727204429.62746: done dumping result, returning 34052 1727204429.62764: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-66a4-e2a3-000000000019] 34052 1727204429.62778: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000019 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204429.63088: no more pending results, returning what we have 34052 1727204429.63095: results queue empty 34052 1727204429.63097: checking for any_errors_fatal 34052 1727204429.63106: done checking for any_errors_fatal 34052 1727204429.63107: checking for max_fail_percentage 34052 1727204429.63109: done checking for max_fail_percentage 34052 1727204429.63110: checking to see if all hosts have failed and the running result is not ok 34052 1727204429.63111: done checking to see if all hosts have failed 34052 1727204429.63111: getting the remaining hosts for this loop 34052 1727204429.63113: done getting the remaining hosts for this loop 34052 1727204429.63123: getting the next task for host managed-node1 34052 1727204429.63132: done getting next task for host managed-node1 34052 1727204429.63137: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34052 1727204429.63142: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204429.63162: getting variables 34052 1727204429.63164: in VariableManager get_vars() 34052 1727204429.63217: Calling all_inventory to load vars for managed-node1 34052 1727204429.63221: Calling groups_inventory to load vars for managed-node1 34052 1727204429.63224: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204429.63241: Calling all_plugins_play to load vars for managed-node1 34052 1727204429.63245: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204429.63249: Calling groups_plugins_play to load vars for managed-node1 34052 1727204429.63785: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000019 34052 1727204429.63790: WORKER PROCESS EXITING 34052 1727204429.65481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204429.67676: done with get_vars() 34052 1727204429.67715: done getting variables 34052 1727204429.67789: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.075) 0:00:15.995 ***** 34052 1727204429.67829: entering _queue_task() for managed-node1/fail 34052 1727204429.68211: worker is 1 (out of 1 available) 34052 1727204429.68227: exiting _queue_task() for managed-node1/fail 34052 1727204429.68243: done queuing things up, now waiting for results queue to drain 34052 1727204429.68244: waiting for pending results... 
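The initscripts abort above was skipped because its guard, network_state != {}, evaluated to False (network_state comes from the role's defaults and is empty in this run); the "system version below 8" task queued next is gated the same way and is about to be skipped for the same reason. A sketch of this guard pattern (illustrative only; the real tasks carry a descriptive msg and likely additional conditions):

    - name: Abort if network_state is used with an unsupported setup
      ansible.builtin.fail:
        msg: "Applying network_state is not supported here"   # hypothetical message
      when: network_state != {}    # the condition reported as false_condition in the skip output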
34052 1727204429.68555: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34052 1727204429.68717: in run() - task 127b8e07-fff9-66a4-e2a3-00000000001a 34052 1727204429.68742: variable 'ansible_search_path' from source: unknown 34052 1727204429.68751: variable 'ansible_search_path' from source: unknown 34052 1727204429.68801: calling self._execute() 34052 1727204429.68911: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204429.69020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204429.69024: variable 'omit' from source: magic vars 34052 1727204429.69376: variable 'ansible_distribution_major_version' from source: facts 34052 1727204429.69396: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204429.69537: variable 'network_state' from source: role '' defaults 34052 1727204429.69553: Evaluated conditional (network_state != {}): False 34052 1727204429.69572: when evaluation is False, skipping this task 34052 1727204429.69581: _execute() done 34052 1727204429.69587: dumping result to json 34052 1727204429.69595: done dumping result, returning 34052 1727204429.69609: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-66a4-e2a3-00000000001a] 34052 1727204429.69619: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001a 34052 1727204429.69972: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001a 34052 1727204429.69976: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204429.70019: no more pending results, returning what we have 34052 1727204429.70023: results queue empty 34052 1727204429.70024: checking for any_errors_fatal 34052 1727204429.70032: done checking for any_errors_fatal 34052 1727204429.70033: checking for max_fail_percentage 34052 1727204429.70035: done checking for max_fail_percentage 34052 1727204429.70036: checking to see if all hosts have failed and the running result is not ok 34052 1727204429.70037: done checking to see if all hosts have failed 34052 1727204429.70037: getting the remaining hosts for this loop 34052 1727204429.70039: done getting the remaining hosts for this loop 34052 1727204429.70043: getting the next task for host managed-node1 34052 1727204429.70048: done getting next task for host managed-node1 34052 1727204429.70052: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34052 1727204429.70056: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204429.70074: getting variables 34052 1727204429.70076: in VariableManager get_vars() 34052 1727204429.70116: Calling all_inventory to load vars for managed-node1 34052 1727204429.70118: Calling groups_inventory to load vars for managed-node1 34052 1727204429.70121: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204429.70133: Calling all_plugins_play to load vars for managed-node1 34052 1727204429.70136: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204429.70139: Calling groups_plugins_play to load vars for managed-node1 34052 1727204429.72056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204429.75480: done with get_vars() 34052 1727204429.75527: done getting variables 34052 1727204429.75715: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.079) 0:00:16.074 ***** 34052 1727204429.75755: entering _queue_task() for managed-node1/fail 34052 1727204429.76272: worker is 1 (out of 1 available) 34052 1727204429.76286: exiting _queue_task() for managed-node1/fail 34052 1727204429.76297: done queuing things up, now waiting for results queue to drain 34052 1727204429.76299: waiting for pending results... 
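The teaming task queued above shows how a when list behaves: in the log that follows, ansible_distribution_major_version | int > 9 evaluates True, then ansible_distribution in __network_rh_distros evaluates False and the task is skipped. Conditions in a when list are ANDed and evaluated in order, stopping at the first failure, which is why only that condition is reported as false_condition. A sketch of a task guarded this way (assumed shape, not the role's source; the real task presumably also checks that team connections are actually requested):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: "Teaming is not supported on EL10 or later"   # hypothetical message
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros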
34052 1727204429.76546: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34052 1727204429.76748: in run() - task 127b8e07-fff9-66a4-e2a3-00000000001b 34052 1727204429.76752: variable 'ansible_search_path' from source: unknown 34052 1727204429.76755: variable 'ansible_search_path' from source: unknown 34052 1727204429.76762: calling self._execute() 34052 1727204429.76873: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204429.76886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204429.76902: variable 'omit' from source: magic vars 34052 1727204429.77339: variable 'ansible_distribution_major_version' from source: facts 34052 1727204429.77358: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204429.77574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204429.80304: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204429.80577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204429.80582: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204429.80690: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204429.80728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204429.80944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204429.81172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204429.81176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204429.81179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204429.81446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204429.81578: variable 'ansible_distribution_major_version' from source: facts 34052 1727204429.81603: Evaluated conditional (ansible_distribution_major_version | int > 9): True 34052 1727204429.81864: variable 'ansible_distribution' from source: facts 34052 1727204429.81994: variable '__network_rh_distros' from source: role '' defaults 34052 1727204429.82004: Evaluated conditional (ansible_distribution in __network_rh_distros): False 34052 1727204429.82015: when evaluation is False, skipping this task 34052 1727204429.82024: _execute() done 34052 1727204429.82036: dumping result to json 34052 1727204429.82045: done dumping result, returning 34052 1727204429.82061: done running TaskExecutor() 
for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-66a4-e2a3-00000000001b] 34052 1727204429.82073: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001b 34052 1727204429.82399: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001b 34052 1727204429.82403: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 34052 1727204429.82478: no more pending results, returning what we have 34052 1727204429.82482: results queue empty 34052 1727204429.82483: checking for any_errors_fatal 34052 1727204429.82490: done checking for any_errors_fatal 34052 1727204429.82491: checking for max_fail_percentage 34052 1727204429.82493: done checking for max_fail_percentage 34052 1727204429.82493: checking to see if all hosts have failed and the running result is not ok 34052 1727204429.82494: done checking to see if all hosts have failed 34052 1727204429.82495: getting the remaining hosts for this loop 34052 1727204429.82497: done getting the remaining hosts for this loop 34052 1727204429.82502: getting the next task for host managed-node1 34052 1727204429.82508: done getting next task for host managed-node1 34052 1727204429.82513: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34052 1727204429.82517: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204429.82536: getting variables 34052 1727204429.82539: in VariableManager get_vars() 34052 1727204429.82589: Calling all_inventory to load vars for managed-node1 34052 1727204429.82592: Calling groups_inventory to load vars for managed-node1 34052 1727204429.82595: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204429.82607: Calling all_plugins_play to load vars for managed-node1 34052 1727204429.82610: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204429.82614: Calling groups_plugins_play to load vars for managed-node1 34052 1727204429.86524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204429.89242: done with get_vars() 34052 1727204429.89289: done getting variables 34052 1727204429.89415: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.137) 0:00:16.211 ***** 34052 1727204429.89460: entering _queue_task() for managed-node1/dnf 34052 1727204429.89968: worker is 1 (out of 1 available) 34052 1727204429.89982: exiting _queue_task() for managed-node1/dnf 34052 1727204429.89999: done queuing things up, now waiting for results queue to drain 34052 1727204429.90001: waiting for pending results... 
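The DNF check queued above is gated on whether any wireless or team connections are defined; in the evaluation that follows, __network_wireless_connections_defined or __network_team_connections_defined comes out False for this play's network_connections, so the task is skipped. A sketch of the gate (the package list variable and module options are hypothetical; only the when expression is taken from the log):

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # hypothetical variable standing in for the role's package list
        state: latest
      when: __network_wireless_connections_defined or __network_team_connections_defined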
34052 1727204429.90240: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34052 1727204429.90480: in run() - task 127b8e07-fff9-66a4-e2a3-00000000001c 34052 1727204429.90484: variable 'ansible_search_path' from source: unknown 34052 1727204429.90487: variable 'ansible_search_path' from source: unknown 34052 1727204429.90490: calling self._execute() 34052 1727204429.90602: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204429.90606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204429.90672: variable 'omit' from source: magic vars 34052 1727204429.91203: variable 'ansible_distribution_major_version' from source: facts 34052 1727204429.91208: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204429.91432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204429.93169: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204429.93524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204429.93555: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204429.93587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204429.93608: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204429.93680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204429.93703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204429.93722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204429.93752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204429.93765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204429.93880: variable 'ansible_distribution' from source: facts 34052 1727204429.93884: variable 'ansible_distribution_major_version' from source: facts 34052 1727204429.93886: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 34052 1727204429.94011: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204429.94140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204429.94143: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204429.94185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204429.94209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204429.94255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204429.94356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204429.94360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204429.94364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204429.94368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204429.94432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204429.94486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204429.94537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204429.94607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204429.94872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204429.94875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204429.94878: variable 'network_connections' from source: task vars 34052 1727204429.94881: variable 'interface' from source: play vars 34052 1727204429.94964: variable 'interface' from source: play vars 34052 1727204429.95077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204429.95286: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204429.95328: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204429.95349: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204429.95375: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204429.95414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204429.95435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204429.95459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204429.95480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204429.95534: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204429.95762: variable 'network_connections' from source: task vars 34052 1727204429.95769: variable 'interface' from source: play vars 34052 1727204429.95821: variable 'interface' from source: play vars 34052 1727204429.95851: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34052 1727204429.95854: when evaluation is False, skipping this task 34052 1727204429.95857: _execute() done 34052 1727204429.95859: dumping result to json 34052 1727204429.95862: done dumping result, returning 34052 1727204429.95872: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-00000000001c] 34052 1727204429.95880: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001c 34052 1727204429.95973: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001c 34052 1727204429.95977: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34052 1727204429.96055: no more pending results, returning what we have 34052 1727204429.96058: results queue empty 34052 1727204429.96059: checking for any_errors_fatal 34052 1727204429.96064: done checking for any_errors_fatal 34052 1727204429.96067: checking for max_fail_percentage 34052 1727204429.96069: done checking for max_fail_percentage 34052 1727204429.96070: checking to see if all hosts have failed and the running result is not ok 34052 1727204429.96070: done checking to see if all hosts have failed 34052 1727204429.96071: getting the remaining hosts for this loop 34052 1727204429.96073: done getting the remaining hosts for this loop 34052 1727204429.96078: getting the next task for host managed-node1 34052 1727204429.96084: done getting next task for host managed-node1 34052 1727204429.96089: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 34052 1727204429.96093: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204429.96111: getting variables 34052 1727204429.96112: in VariableManager get_vars() 34052 1727204429.96160: Calling all_inventory to load vars for managed-node1 34052 1727204429.96163: Calling groups_inventory to load vars for managed-node1 34052 1727204429.96270: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204429.96283: Calling all_plugins_play to load vars for managed-node1 34052 1727204429.96286: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204429.96289: Calling groups_plugins_play to load vars for managed-node1 34052 1727204429.98096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204430.00358: done with get_vars() 34052 1727204430.00398: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34052 1727204430.00479: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.110) 0:00:16.322 ***** 34052 1727204430.00519: entering _queue_task() for managed-node1/yum 34052 1727204430.00521: Creating lock for yum 34052 1727204430.00951: worker is 1 (out of 1 available) 34052 1727204430.00970: exiting _queue_task() for managed-node1/yum 34052 1727204430.00985: done queuing things up, now waiting for results queue to drain 34052 1727204430.00987: waiting for pending results... 
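Two details stand out above: ansible.builtin.yum is redirected to the dnf action plugin (on this ansible-core version yum is effectively a compatibility alias for dnf, as the "redirecting (type: action)" line shows), and the task is about to be skipped because it applies only where ansible_distribution_major_version | int < 8. A sketch of the YUM-side counterpart, under the same assumptions as the DNF sketch above:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:    # resolved to the dnf action on this controller, per the redirect line
        name: "{{ network_packages }}"   # hypothetical variable
        state: latest
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined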
34052 1727204430.01403: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34052 1727204430.01557: in run() - task 127b8e07-fff9-66a4-e2a3-00000000001d 34052 1727204430.01591: variable 'ansible_search_path' from source: unknown 34052 1727204430.01601: variable 'ansible_search_path' from source: unknown 34052 1727204430.01665: calling self._execute() 34052 1727204430.01769: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204430.01775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204430.01784: variable 'omit' from source: magic vars 34052 1727204430.02373: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.02378: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204430.02449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204430.05098: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204430.05198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204430.05239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204430.05288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204430.05316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204430.05421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.05456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.05492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.05536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.05551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.05703: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.05724: Evaluated conditional (ansible_distribution_major_version | int < 8): False 34052 1727204430.05740: when evaluation is False, skipping this task 34052 1727204430.05757: _execute() done 34052 1727204430.05760: dumping result to json 34052 1727204430.05763: done dumping result, returning 34052 1727204430.05775: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-00000000001d] 34052 
1727204430.05779: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001d 34052 1727204430.05885: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001d 34052 1727204430.05888: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 34052 1727204430.05966: no more pending results, returning what we have 34052 1727204430.05970: results queue empty 34052 1727204430.05971: checking for any_errors_fatal 34052 1727204430.05978: done checking for any_errors_fatal 34052 1727204430.05979: checking for max_fail_percentage 34052 1727204430.05981: done checking for max_fail_percentage 34052 1727204430.05982: checking to see if all hosts have failed and the running result is not ok 34052 1727204430.05982: done checking to see if all hosts have failed 34052 1727204430.05983: getting the remaining hosts for this loop 34052 1727204430.05985: done getting the remaining hosts for this loop 34052 1727204430.05990: getting the next task for host managed-node1 34052 1727204430.05996: done getting next task for host managed-node1 34052 1727204430.06002: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34052 1727204430.06005: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204430.06023: getting variables 34052 1727204430.06024: in VariableManager get_vars() 34052 1727204430.06107: Calling all_inventory to load vars for managed-node1 34052 1727204430.06111: Calling groups_inventory to load vars for managed-node1 34052 1727204430.06114: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204430.06124: Calling all_plugins_play to load vars for managed-node1 34052 1727204430.06127: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204430.06129: Calling groups_plugins_play to load vars for managed-node1 34052 1727204430.07167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204430.13489: done with get_vars() 34052 1727204430.13529: done getting variables 34052 1727204430.13587: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.130) 0:00:16.453 ***** 34052 1727204430.13621: entering _queue_task() for managed-node1/fail 34052 1727204430.14200: worker is 1 (out of 1 available) 34052 1727204430.14218: exiting _queue_task() for managed-node1/fail 34052 1727204430.14233: done queuing things up, now waiting for results queue to drain 34052 1727204430.14235: waiting for pending results... 
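
The consent task announced here uses the fail action module to stop the run unless the operator has agreed to a NetworkManager restart; on this host it is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true. A rough sketch, assuming a typical fail-with-when pattern (the exact message and any extra guard variables in the real role are not shown in the log):

# Illustrative only; the real task's message and additional conditions may differ.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager is required; set the appropriate consent variable to proceed.
  when: __network_wireless_connections_defined or __network_team_connections_defined
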
34052 1727204430.14573: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34052 1727204430.14746: in run() - task 127b8e07-fff9-66a4-e2a3-00000000001e 34052 1727204430.14751: variable 'ansible_search_path' from source: unknown 34052 1727204430.14756: variable 'ansible_search_path' from source: unknown 34052 1727204430.14760: calling self._execute() 34052 1727204430.14854: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204430.14862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204430.14883: variable 'omit' from source: magic vars 34052 1727204430.15324: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.15336: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204430.15437: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204430.15689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204430.18114: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204430.18198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204430.18236: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204430.18286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204430.18353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204430.18402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.18441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.18499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.18555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.18560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.18607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.18625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.18806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.18813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.18816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.18818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.18821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.18823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.18921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.18924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.19117: variable 'network_connections' from source: task vars 34052 1727204430.19134: variable 'interface' from source: play vars 34052 1727204430.19219: variable 'interface' from source: play vars 34052 1727204430.19390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204430.19613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204430.19617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204430.19620: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204430.19661: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204430.19715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204430.19739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204430.19766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.19793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204430.19863: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204430.20156: variable 'network_connections' from 
source: task vars 34052 1727204430.20180: variable 'interface' from source: play vars 34052 1727204430.20247: variable 'interface' from source: play vars 34052 1727204430.20287: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34052 1727204430.20291: when evaluation is False, skipping this task 34052 1727204430.20294: _execute() done 34052 1727204430.20296: dumping result to json 34052 1727204430.20298: done dumping result, returning 34052 1727204430.20391: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-00000000001e] 34052 1727204430.20394: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001e skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34052 1727204430.20758: no more pending results, returning what we have 34052 1727204430.20763: results queue empty 34052 1727204430.20764: checking for any_errors_fatal 34052 1727204430.20772: done checking for any_errors_fatal 34052 1727204430.20776: checking for max_fail_percentage 34052 1727204430.20778: done checking for max_fail_percentage 34052 1727204430.20779: checking to see if all hosts have failed and the running result is not ok 34052 1727204430.20780: done checking to see if all hosts have failed 34052 1727204430.20781: getting the remaining hosts for this loop 34052 1727204430.20782: done getting the remaining hosts for this loop 34052 1727204430.20787: getting the next task for host managed-node1 34052 1727204430.20796: done getting next task for host managed-node1 34052 1727204430.20801: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34052 1727204430.20804: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204430.20819: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001e 34052 1727204430.20822: WORKER PROCESS EXITING 34052 1727204430.20832: getting variables 34052 1727204430.20834: in VariableManager get_vars() 34052 1727204430.20890: Calling all_inventory to load vars for managed-node1 34052 1727204430.20899: Calling groups_inventory to load vars for managed-node1 34052 1727204430.20902: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204430.20913: Calling all_plugins_play to load vars for managed-node1 34052 1727204430.20916: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204430.20919: Calling groups_plugins_play to load vars for managed-node1 34052 1727204430.22857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204430.25351: done with get_vars() 34052 1727204430.25401: done getting variables 34052 1727204430.25471: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.119) 0:00:16.572 ***** 34052 1727204430.25539: entering _queue_task() for managed-node1/package 34052 1727204430.25946: worker is 1 (out of 1 available) 34052 1727204430.25960: exiting _queue_task() for managed-node1/package 34052 1727204430.25977: done queuing things up, now waiting for results queue to drain 34052 1727204430.25979: waiting for pending results... 
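
The "Install packages" task queued above uses the generic package action and, as the later skip result shows, runs only when something in network_packages is missing from the gathered package facts, i.e. when not network_packages is subset(ansible_facts.packages.keys()) is True. A minimal sketch under that assumption (variable names are taken from the log; the real task may carry extra retries or environment settings):

# Sketch; only the module, list variable, and conditional are taken from the log.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
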
34052 1727204430.26365: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 34052 1727204430.26528: in run() - task 127b8e07-fff9-66a4-e2a3-00000000001f 34052 1727204430.26533: variable 'ansible_search_path' from source: unknown 34052 1727204430.26536: variable 'ansible_search_path' from source: unknown 34052 1727204430.26572: calling self._execute() 34052 1727204430.26713: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204430.26717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204430.26722: variable 'omit' from source: magic vars 34052 1727204430.27236: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.27240: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204430.27455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204430.27771: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204430.27808: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204430.27909: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204430.27947: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204430.28079: variable 'network_packages' from source: role '' defaults 34052 1727204430.28241: variable '__network_provider_setup' from source: role '' defaults 34052 1727204430.28262: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204430.28339: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204430.28347: variable '__network_packages_default_nm' from source: role '' defaults 34052 1727204430.28430: variable '__network_packages_default_nm' from source: role '' defaults 34052 1727204430.28687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204430.31522: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204430.31591: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204430.31629: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204430.31663: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204430.31690: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204430.31795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.31807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.31835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.31877: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.31892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.31942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.31972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.31999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.32084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.32088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.32276: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34052 1727204430.32414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.32446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.32467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.32506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.32519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.32626: variable 'ansible_python' from source: facts 34052 1727204430.32664: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34052 1727204430.32760: variable '__network_wpa_supplicant_required' from source: role '' defaults 34052 1727204430.32898: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34052 1727204430.33023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.33062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34052 1727204430.33073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.33116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.33151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.33220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.33233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.33331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.33336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.33339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.33532: variable 'network_connections' from source: task vars 34052 1727204430.33637: variable 'interface' from source: play vars 34052 1727204430.33641: variable 'interface' from source: play vars 34052 1727204430.33722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204430.33759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204430.33783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.33812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204430.33941: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204430.34197: variable 'network_connections' from source: task vars 34052 1727204430.34200: variable 'interface' from source: play vars 34052 1727204430.34322: variable 'interface' from source: play vars 34052 1727204430.34354: variable '__network_packages_default_wireless' from source: role '' defaults 34052 1727204430.34471: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204430.34740: variable 'network_connections' from source: task vars 34052 1727204430.34743: variable 
'interface' from source: play vars 34052 1727204430.34810: variable 'interface' from source: play vars 34052 1727204430.34840: variable '__network_packages_default_team' from source: role '' defaults 34052 1727204430.34923: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204430.35373: variable 'network_connections' from source: task vars 34052 1727204430.35377: variable 'interface' from source: play vars 34052 1727204430.35379: variable 'interface' from source: play vars 34052 1727204430.35382: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204430.35440: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204430.35446: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204430.35513: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204430.35853: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34052 1727204430.36344: variable 'network_connections' from source: task vars 34052 1727204430.36348: variable 'interface' from source: play vars 34052 1727204430.36418: variable 'interface' from source: play vars 34052 1727204430.36432: variable 'ansible_distribution' from source: facts 34052 1727204430.36435: variable '__network_rh_distros' from source: role '' defaults 34052 1727204430.36442: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.36469: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34052 1727204430.36663: variable 'ansible_distribution' from source: facts 34052 1727204430.36668: variable '__network_rh_distros' from source: role '' defaults 34052 1727204430.36675: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.36718: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34052 1727204430.36879: variable 'ansible_distribution' from source: facts 34052 1727204430.36883: variable '__network_rh_distros' from source: role '' defaults 34052 1727204430.36889: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.36944: variable 'network_provider' from source: set_fact 34052 1727204430.36960: variable 'ansible_facts' from source: unknown 34052 1727204430.37928: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 34052 1727204430.37932: when evaluation is False, skipping this task 34052 1727204430.37935: _execute() done 34052 1727204430.37937: dumping result to json 34052 1727204430.37941: done dumping result, returning 34052 1727204430.37974: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-66a4-e2a3-00000000001f] 34052 1727204430.37978: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001f 34052 1727204430.38072: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000001f 34052 1727204430.38075: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 34052 1727204430.38138: no more pending results, returning what we have 34052 1727204430.38142: results queue empty 34052 1727204430.38143: checking for any_errors_fatal 34052 1727204430.38150: done checking for any_errors_fatal 34052 1727204430.38151: 
checking for max_fail_percentage 34052 1727204430.38152: done checking for max_fail_percentage 34052 1727204430.38153: checking to see if all hosts have failed and the running result is not ok 34052 1727204430.38154: done checking to see if all hosts have failed 34052 1727204430.38155: getting the remaining hosts for this loop 34052 1727204430.38157: done getting the remaining hosts for this loop 34052 1727204430.38161: getting the next task for host managed-node1 34052 1727204430.38170: done getting next task for host managed-node1 34052 1727204430.38174: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34052 1727204430.38177: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204430.38195: getting variables 34052 1727204430.38196: in VariableManager get_vars() 34052 1727204430.38242: Calling all_inventory to load vars for managed-node1 34052 1727204430.38245: Calling groups_inventory to load vars for managed-node1 34052 1727204430.38247: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204430.38260: Calling all_plugins_play to load vars for managed-node1 34052 1727204430.38263: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204430.38390: Calling groups_plugins_play to load vars for managed-node1 34052 1727204430.40543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204430.42901: done with get_vars() 34052 1727204430.42950: done getting variables 34052 1727204430.43030: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.175) 0:00:16.747 ***** 34052 1727204430.43080: entering _queue_task() for managed-node1/package 34052 1727204430.43856: worker is 1 (out of 1 available) 34052 1727204430.44083: exiting _queue_task() for managed-node1/package 34052 1727204430.44098: done queuing things up, now waiting for results queue to drain 34052 1727204430.44100: waiting for pending results... 
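
Because network_state is left at its empty default, the conditional network_state != {} evaluates to False and the task queued above is skipped. A hedged sketch of the pattern (the package names are an assumption based on the task title, not on the role source):

# Assumed package list; only the conditional and the package module come from the log.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
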
34052 1727204430.44691: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34052 1727204430.44697: in run() - task 127b8e07-fff9-66a4-e2a3-000000000020 34052 1727204430.44701: variable 'ansible_search_path' from source: unknown 34052 1727204430.44704: variable 'ansible_search_path' from source: unknown 34052 1727204430.44752: calling self._execute() 34052 1727204430.44895: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204430.44899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204430.44921: variable 'omit' from source: magic vars 34052 1727204430.45421: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.45436: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204430.45769: variable 'network_state' from source: role '' defaults 34052 1727204430.45774: Evaluated conditional (network_state != {}): False 34052 1727204430.45781: when evaluation is False, skipping this task 34052 1727204430.45785: _execute() done 34052 1727204430.45788: dumping result to json 34052 1727204430.45790: done dumping result, returning 34052 1727204430.45809: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-66a4-e2a3-000000000020] 34052 1727204430.45812: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000020 34052 1727204430.46166: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000020 34052 1727204430.46170: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204430.46329: no more pending results, returning what we have 34052 1727204430.46333: results queue empty 34052 1727204430.46334: checking for any_errors_fatal 34052 1727204430.46341: done checking for any_errors_fatal 34052 1727204430.46341: checking for max_fail_percentage 34052 1727204430.46343: done checking for max_fail_percentage 34052 1727204430.46344: checking to see if all hosts have failed and the running result is not ok 34052 1727204430.46345: done checking to see if all hosts have failed 34052 1727204430.46346: getting the remaining hosts for this loop 34052 1727204430.46348: done getting the remaining hosts for this loop 34052 1727204430.46353: getting the next task for host managed-node1 34052 1727204430.46360: done getting next task for host managed-node1 34052 1727204430.46364: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34052 1727204430.46370: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204430.46392: getting variables 34052 1727204430.46397: in VariableManager get_vars() 34052 1727204430.46453: Calling all_inventory to load vars for managed-node1 34052 1727204430.46456: Calling groups_inventory to load vars for managed-node1 34052 1727204430.46459: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204430.46665: Calling all_plugins_play to load vars for managed-node1 34052 1727204430.46671: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204430.46675: Calling groups_plugins_play to load vars for managed-node1 34052 1727204430.51218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204430.57555: done with get_vars() 34052 1727204430.57648: done getting variables 34052 1727204430.57772: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.147) 0:00:16.895 ***** 34052 1727204430.57814: entering _queue_task() for managed-node1/package 34052 1727204430.58504: worker is 1 (out of 1 available) 34052 1727204430.58516: exiting _queue_task() for managed-node1/package 34052 1727204430.58528: done queuing things up, now waiting for results queue to drain 34052 1727204430.58530: waiting for pending results... 
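
The follow-up task at main.yml:96 is the same pattern with a different package and is skipped for the same reason (network_state != {} is False); sketched here only for completeness, with the package name inferred from the task title:

# Same guard as the previous task; package name taken from the task title.
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
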
34052 1727204430.58662: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34052 1727204430.58872: in run() - task 127b8e07-fff9-66a4-e2a3-000000000021 34052 1727204430.58876: variable 'ansible_search_path' from source: unknown 34052 1727204430.58879: variable 'ansible_search_path' from source: unknown 34052 1727204430.58882: calling self._execute() 34052 1727204430.58973: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204430.58991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204430.59008: variable 'omit' from source: magic vars 34052 1727204430.59643: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.59664: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204430.59899: variable 'network_state' from source: role '' defaults 34052 1727204430.59919: Evaluated conditional (network_state != {}): False 34052 1727204430.59928: when evaluation is False, skipping this task 34052 1727204430.59935: _execute() done 34052 1727204430.59943: dumping result to json 34052 1727204430.59955: done dumping result, returning 34052 1727204430.59979: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-66a4-e2a3-000000000021] 34052 1727204430.60061: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000021 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204430.60221: no more pending results, returning what we have 34052 1727204430.60225: results queue empty 34052 1727204430.60226: checking for any_errors_fatal 34052 1727204430.60237: done checking for any_errors_fatal 34052 1727204430.60238: checking for max_fail_percentage 34052 1727204430.60240: done checking for max_fail_percentage 34052 1727204430.60241: checking to see if all hosts have failed and the running result is not ok 34052 1727204430.60242: done checking to see if all hosts have failed 34052 1727204430.60243: getting the remaining hosts for this loop 34052 1727204430.60245: done getting the remaining hosts for this loop 34052 1727204430.60250: getting the next task for host managed-node1 34052 1727204430.60258: done getting next task for host managed-node1 34052 1727204430.60262: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34052 1727204430.60272: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204430.60292: getting variables 34052 1727204430.60294: in VariableManager get_vars() 34052 1727204430.60410: Calling all_inventory to load vars for managed-node1 34052 1727204430.60415: Calling groups_inventory to load vars for managed-node1 34052 1727204430.60418: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204430.60436: Calling all_plugins_play to load vars for managed-node1 34052 1727204430.60439: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204430.60444: Calling groups_plugins_play to load vars for managed-node1 34052 1727204430.61580: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000021 34052 1727204430.61590: WORKER PROCESS EXITING 34052 1727204430.65658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204430.71336: done with get_vars() 34052 1727204430.71381: done getting variables 34052 1727204430.71952: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.141) 0:00:17.037 ***** 34052 1727204430.71998: entering _queue_task() for managed-node1/service 34052 1727204430.72000: Creating lock for service 34052 1727204430.72775: worker is 1 (out of 1 available) 34052 1727204430.72792: exiting _queue_task() for managed-node1/service 34052 1727204430.72809: done queuing things up, now waiting for results queue to drain 34052 1727204430.72811: waiting for pending results... 
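
This service task would bounce NetworkManager when wireless or team connections are defined; on this run it is skipped because the same wireless/team conditional is False. A sketch assuming a plain service restart (the real role may route this through the network_service_name variable or a handler instead):

# Illustration; the service name and state are assumptions.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
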
34052 1727204430.73484: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34052 1727204430.74000: in run() - task 127b8e07-fff9-66a4-e2a3-000000000022 34052 1727204430.74018: variable 'ansible_search_path' from source: unknown 34052 1727204430.74022: variable 'ansible_search_path' from source: unknown 34052 1727204430.74063: calling self._execute() 34052 1727204430.74383: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204430.74390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204430.74402: variable 'omit' from source: magic vars 34052 1727204430.75346: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.75472: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204430.75721: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204430.76166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204430.81635: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204430.83346: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204430.83574: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204430.83583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204430.83585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204430.83882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.83911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.84052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.84099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.84115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.84170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.84310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.84341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34052 1727204430.84415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.84431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.84632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204430.84659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204430.84692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.85071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204430.85075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204430.85301: variable 'network_connections' from source: task vars 34052 1727204430.85420: variable 'interface' from source: play vars 34052 1727204430.85533: variable 'interface' from source: play vars 34052 1727204430.85663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204430.86204: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204430.86237: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204430.86377: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204430.86380: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204430.86518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204430.86631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204430.86635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204430.86693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204430.86878: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204430.87457: variable 'network_connections' from source: task vars 34052 1727204430.87462: variable 'interface' from source: 
play vars 34052 1727204430.87656: variable 'interface' from source: play vars 34052 1727204430.87697: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34052 1727204430.87701: when evaluation is False, skipping this task 34052 1727204430.87703: _execute() done 34052 1727204430.87706: dumping result to json 34052 1727204430.87709: done dumping result, returning 34052 1727204430.87718: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-000000000022] 34052 1727204430.87720: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000022 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34052 1727204430.88000: no more pending results, returning what we have 34052 1727204430.88004: results queue empty 34052 1727204430.88005: checking for any_errors_fatal 34052 1727204430.88013: done checking for any_errors_fatal 34052 1727204430.88014: checking for max_fail_percentage 34052 1727204430.88016: done checking for max_fail_percentage 34052 1727204430.88017: checking to see if all hosts have failed and the running result is not ok 34052 1727204430.88018: done checking to see if all hosts have failed 34052 1727204430.88019: getting the remaining hosts for this loop 34052 1727204430.88020: done getting the remaining hosts for this loop 34052 1727204430.88027: getting the next task for host managed-node1 34052 1727204430.88034: done getting next task for host managed-node1 34052 1727204430.88038: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34052 1727204430.88041: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204430.88058: getting variables 34052 1727204430.88059: in VariableManager get_vars() 34052 1727204430.88103: Calling all_inventory to load vars for managed-node1 34052 1727204430.88106: Calling groups_inventory to load vars for managed-node1 34052 1727204430.88108: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204430.88119: Calling all_plugins_play to load vars for managed-node1 34052 1727204430.88122: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204430.88127: Calling groups_plugins_play to load vars for managed-node1 34052 1727204430.88831: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000022 34052 1727204430.88835: WORKER PROCESS EXITING 34052 1727204430.93149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204430.97291: done with get_vars() 34052 1727204430.97337: done getting variables 34052 1727204430.97408: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.254) 0:00:17.291 ***** 34052 1727204430.97446: entering _queue_task() for managed-node1/service 34052 1727204430.97829: worker is 1 (out of 1 available) 34052 1727204430.97844: exiting _queue_task() for managed-node1/service 34052 1727204430.97858: done queuing things up, now waiting for results queue to drain 34052 1727204430.97859: waiting for pending results... 
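
Unlike the preceding tasks, this one is not skipped: the log shows the conditional network_provider == "nm" or network_state != {} evaluating to True, and the role then resolves network_service_name (which defaults to NetworkManager under the nm provider) before invoking the service action. A hedged sketch of the shape of such a task, reusing the variable names that appear in the log:

# Sketch; the enabled/state values are assumptions, network_service_name comes from the log.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
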
34052 1727204430.98171: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34052 1727204430.98333: in run() - task 127b8e07-fff9-66a4-e2a3-000000000023 34052 1727204430.98359: variable 'ansible_search_path' from source: unknown 34052 1727204430.98368: variable 'ansible_search_path' from source: unknown 34052 1727204430.98415: calling self._execute() 34052 1727204430.98542: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204430.98555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204430.98572: variable 'omit' from source: magic vars 34052 1727204430.99018: variable 'ansible_distribution_major_version' from source: facts 34052 1727204430.99042: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204430.99241: variable 'network_provider' from source: set_fact 34052 1727204430.99251: variable 'network_state' from source: role '' defaults 34052 1727204430.99271: Evaluated conditional (network_provider == "nm" or network_state != {}): True 34052 1727204430.99284: variable 'omit' from source: magic vars 34052 1727204430.99356: variable 'omit' from source: magic vars 34052 1727204430.99397: variable 'network_service_name' from source: role '' defaults 34052 1727204430.99663: variable 'network_service_name' from source: role '' defaults 34052 1727204430.99669: variable '__network_provider_setup' from source: role '' defaults 34052 1727204430.99672: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204430.99908: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204430.99921: variable '__network_packages_default_nm' from source: role '' defaults 34052 1727204431.00034: variable '__network_packages_default_nm' from source: role '' defaults 34052 1727204431.00474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204431.03807: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204431.04029: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204431.04084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204431.04186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204431.04220: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204431.04483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204431.04521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204431.04769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204431.04774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 34052 1727204431.04777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204431.04983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204431.04987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204431.05013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204431.05076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204431.05144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204431.05445: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34052 1727204431.05588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204431.05621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204431.05662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204431.05712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204431.05737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204431.05845: variable 'ansible_python' from source: facts 34052 1727204431.05887: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34052 1727204431.05995: variable '__network_wpa_supplicant_required' from source: role '' defaults 34052 1727204431.06091: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34052 1727204431.06243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204431.06277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204431.06312: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204431.06399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204431.06403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204431.06447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204431.06488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204431.06523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204431.06616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204431.06620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204431.06759: variable 'network_connections' from source: task vars 34052 1727204431.06775: variable 'interface' from source: play vars 34052 1727204431.06861: variable 'interface' from source: play vars 34052 1727204431.06990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204431.07213: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204431.07572: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204431.07576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204431.07579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204431.07649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204431.07871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204431.07875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204431.07889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204431.07952: variable '__network_wireless_connections_defined' from source: role 
'' defaults 34052 1727204431.08701: variable 'network_connections' from source: task vars 34052 1727204431.08785: variable 'interface' from source: play vars 34052 1727204431.09002: variable 'interface' from source: play vars 34052 1727204431.09104: variable '__network_packages_default_wireless' from source: role '' defaults 34052 1727204431.09330: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204431.10023: variable 'network_connections' from source: task vars 34052 1727204431.10093: variable 'interface' from source: play vars 34052 1727204431.10301: variable 'interface' from source: play vars 34052 1727204431.10340: variable '__network_packages_default_team' from source: role '' defaults 34052 1727204431.10736: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204431.11170: variable 'network_connections' from source: task vars 34052 1727204431.11296: variable 'interface' from source: play vars 34052 1727204431.11388: variable 'interface' from source: play vars 34052 1727204431.11612: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204431.11937: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204431.11941: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204431.12153: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204431.12517: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34052 1727204431.13683: variable 'network_connections' from source: task vars 34052 1727204431.13812: variable 'interface' from source: play vars 34052 1727204431.14017: variable 'interface' from source: play vars 34052 1727204431.14027: variable 'ansible_distribution' from source: facts 34052 1727204431.14038: variable '__network_rh_distros' from source: role '' defaults 34052 1727204431.14049: variable 'ansible_distribution_major_version' from source: facts 34052 1727204431.14342: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34052 1727204431.14608: variable 'ansible_distribution' from source: facts 34052 1727204431.14619: variable '__network_rh_distros' from source: role '' defaults 34052 1727204431.14634: variable 'ansible_distribution_major_version' from source: facts 34052 1727204431.14648: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34052 1727204431.14871: variable 'ansible_distribution' from source: facts 34052 1727204431.14885: variable '__network_rh_distros' from source: role '' defaults 34052 1727204431.14896: variable 'ansible_distribution_major_version' from source: facts 34052 1727204431.14942: variable 'network_provider' from source: set_fact 34052 1727204431.14976: variable 'omit' from source: magic vars 34052 1727204431.15013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204431.15051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204431.15077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204431.15103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204431.15116: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204431.15153: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204431.15161: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204431.15171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204431.15287: Set connection var ansible_connection to ssh 34052 1727204431.15300: Set connection var ansible_timeout to 10 34052 1727204431.15312: Set connection var ansible_pipelining to False 34052 1727204431.15321: Set connection var ansible_shell_type to sh 34052 1727204431.15336: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204431.15347: Set connection var ansible_shell_executable to /bin/sh 34052 1727204431.15380: variable 'ansible_shell_executable' from source: unknown 34052 1727204431.15388: variable 'ansible_connection' from source: unknown 34052 1727204431.15395: variable 'ansible_module_compression' from source: unknown 34052 1727204431.15402: variable 'ansible_shell_type' from source: unknown 34052 1727204431.15429: variable 'ansible_shell_executable' from source: unknown 34052 1727204431.15433: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204431.15435: variable 'ansible_pipelining' from source: unknown 34052 1727204431.15437: variable 'ansible_timeout' from source: unknown 34052 1727204431.15439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204431.15646: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204431.15650: variable 'omit' from source: magic vars 34052 1727204431.15657: starting attempt loop 34052 1727204431.15659: running the handler 34052 1727204431.15705: variable 'ansible_facts' from source: unknown 34052 1727204431.16914: _low_level_execute_command(): starting 34052 1727204431.16934: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204431.17728: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204431.17750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204431.17772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204431.17839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204431.17897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 
1727204431.17915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204431.17945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204431.18071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204431.19959: stdout chunk (state=3): >>>/root <<< 34052 1727204431.20012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204431.20204: stderr chunk (state=3): >>><<< 34052 1727204431.20210: stdout chunk (state=3): >>><<< 34052 1727204431.20342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204431.20346: _low_level_execute_command(): starting 34052 1727204431.20349: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370 `" && echo ansible-tmp-1727204431.202496-35135-195162883143370="` echo /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370 `" ) && sleep 0' 34052 1727204431.21009: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204431.21029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204431.21046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204431.21091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204431.21105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204431.21205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204431.21234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204431.21327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204431.23720: stdout chunk (state=3): >>>ansible-tmp-1727204431.202496-35135-195162883143370=/root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370 <<< 34052 1727204431.23821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204431.23876: stderr chunk (state=3): >>><<< 34052 1727204431.23895: stdout chunk (state=3): >>><<< 34052 1727204431.23933: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204431.202496-35135-195162883143370=/root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204431.23984: variable 'ansible_module_compression' from source: unknown 34052 1727204431.24063: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 34052 1727204431.24074: ANSIBALLZ: Acquiring lock 34052 1727204431.24082: ANSIBALLZ: Lock acquired: 140141530567488 34052 1727204431.24091: ANSIBALLZ: Creating module 34052 1727204432.02363: ANSIBALLZ: Writing module into payload 34052 1727204432.02441: ANSIBALLZ: Writing module 34052 1727204432.02489: ANSIBALLZ: Renaming module 34052 1727204432.02495: ANSIBALLZ: Done creating module 34052 1727204432.02519: variable 'ansible_facts' from source: unknown 34052 1727204432.02771: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/AnsiballZ_systemd.py 34052 1727204432.02929: Sending initial data 34052 1727204432.02932: Sent initial data (155 bytes) 34052 1727204432.03840: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204432.03999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204432.04004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.04008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204432.04011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204432.04013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204432.04195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204432.05986: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204432.06030: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204432.06083: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/AnsiballZ_systemd.py" <<< 34052 1727204432.06201: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpqs77ssvo /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/AnsiballZ_systemd.py <<< 34052 1727204432.06205: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpqs77ssvo" to remote "/root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/AnsiballZ_systemd.py" <<< 34052 1727204432.09737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204432.09848: stderr chunk (state=3): >>><<< 34052 1727204432.09952: stdout chunk (state=3): >>><<< 34052 1727204432.09955: done transferring module to remote 34052 1727204432.09958: _low_level_execute_command(): starting 34052 1727204432.09960: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/ /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/AnsiballZ_systemd.py && sleep 0' 34052 1727204432.11071: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204432.11076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.11178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204432.11233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204432.11293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204432.13390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204432.13604: stderr chunk (state=3): >>><<< 34052 1727204432.13609: stdout chunk (state=3): >>><<< 34052 1727204432.13673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204432.13677: _low_level_execute_command(): starting 34052 1727204432.13681: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/AnsiballZ_systemd.py && sleep 0' 34052 1727204432.14890: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204432.14924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204432.15007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.15045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204432.15063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204432.15287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204432.15458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204432.48755: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "673", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ExecMainStartTimestampMonotonic": "31464158", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "673", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11902976", "MemoryPeak": "13578240", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3525267456", "CPUUsageNSec": "1704859000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", 
"CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 34052 1727204432.48783: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target", "After": "systemd-journald.socket dbus-broker.service basic.target network-pre.target sysinit.target dbus.socket system.slice cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:13 EDT", "StateChangeTimestampMonotonic": "359196339", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:44 EDT", "InactiveExitTimestampMonotonic": "31464340", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:46 EDT", "ActiveEnterTimestampMonotonic": "32958713", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ConditionTimestampMonotonic": "31456341", "AssertTimestamp": "Tue 2024-09-24 14:48:44 EDT", "AssertTimestampMonotonic": "31456345", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"0b953f6a210e485cbebf0a8e98fe18d8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 34052 1727204432.50954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204432.50984: stdout chunk (state=3): >>><<< 34052 1727204432.51002: stderr chunk (state=3): >>><<< 34052 1727204432.51025: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "673", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ExecMainStartTimestampMonotonic": "31464158", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "673", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11902976", "MemoryPeak": "13578240", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3525267456", "CPUUsageNSec": "1704859000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", 
"KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target", "After": "systemd-journald.socket dbus-broker.service basic.target network-pre.target sysinit.target dbus.socket system.slice cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:13 EDT", "StateChangeTimestampMonotonic": "359196339", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:44 EDT", "InactiveExitTimestampMonotonic": "31464340", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:46 EDT", "ActiveEnterTimestampMonotonic": "32958713", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ConditionTimestampMonotonic": "31456341", "AssertTimestamp": "Tue 2024-09-24 14:48:44 EDT", "AssertTimestampMonotonic": "31456345", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "0b953f6a210e485cbebf0a8e98fe18d8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, 
"daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204432.51373: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204432.51378: _low_level_execute_command(): starting 34052 1727204432.51380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204431.202496-35135-195162883143370/ > /dev/null 2>&1 && sleep 0' 34052 1727204432.52110: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204432.52147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204432.52161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204432.52200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.52238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204432.52250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204432.52262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 
originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.52346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204432.52362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204432.52367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204432.52412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204432.54591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204432.54595: stdout chunk (state=3): >>><<< 34052 1727204432.54598: stderr chunk (state=3): >>><<< 34052 1727204432.54600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204432.54603: handler run complete 34052 1727204432.54610: attempt loop complete, returning result 34052 1727204432.54622: _execute() done 34052 1727204432.54625: dumping result to json 34052 1727204432.54649: done dumping result, returning 34052 1727204432.54659: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-66a4-e2a3-000000000023] 34052 1727204432.54662: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000023 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204432.55189: no more pending results, returning what we have 34052 1727204432.55193: results queue empty 34052 1727204432.55194: checking for any_errors_fatal 34052 1727204432.55202: done checking for any_errors_fatal 34052 1727204432.55203: checking for max_fail_percentage 34052 1727204432.55205: done checking for max_fail_percentage 34052 1727204432.55206: checking to see if all hosts have failed and the running result is not ok 34052 1727204432.55208: done checking to see if all hosts have failed 34052 1727204432.55209: getting the remaining hosts for this loop 34052 1727204432.55211: done getting the remaining hosts for this loop 34052 1727204432.55216: getting the next task for host managed-node1 34052 1727204432.55222: done getting next task for host managed-node1 34052 1727204432.55229: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 34052 1727204432.55233: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204432.55246: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000023 34052 1727204432.55250: WORKER PROCESS EXITING 34052 1727204432.55371: getting variables 34052 1727204432.55383: in VariableManager get_vars() 34052 1727204432.55449: Calling all_inventory to load vars for managed-node1 34052 1727204432.55452: Calling groups_inventory to load vars for managed-node1 34052 1727204432.55454: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204432.55476: Calling all_plugins_play to load vars for managed-node1 34052 1727204432.55481: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204432.55486: Calling groups_plugins_play to load vars for managed-node1 34052 1727204432.56686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204432.58356: done with get_vars() 34052 1727204432.58413: done getting variables 34052 1727204432.58484: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:00:32 -0400 (0:00:01.610) 0:00:18.902 ***** 34052 1727204432.58525: entering _queue_task() for managed-node1/service 34052 1727204432.58930: worker is 1 (out of 1 available) 34052 1727204432.58944: exiting _queue_task() for managed-node1/service 34052 1727204432.58958: done queuing things up, now waiting for results queue to drain 34052 1727204432.58959: waiting for pending results... 
34052 1727204432.59237: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34052 1727204432.59399: in run() - task 127b8e07-fff9-66a4-e2a3-000000000024 34052 1727204432.59437: variable 'ansible_search_path' from source: unknown 34052 1727204432.59441: variable 'ansible_search_path' from source: unknown 34052 1727204432.59470: calling self._execute() 34052 1727204432.59554: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204432.59559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204432.59576: variable 'omit' from source: magic vars 34052 1727204432.59911: variable 'ansible_distribution_major_version' from source: facts 34052 1727204432.59921: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204432.60011: variable 'network_provider' from source: set_fact 34052 1727204432.60018: Evaluated conditional (network_provider == "nm"): True 34052 1727204432.60093: variable '__network_wpa_supplicant_required' from source: role '' defaults 34052 1727204432.60163: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34052 1727204432.60303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204432.62464: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204432.62536: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204432.62566: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204432.62591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204432.62611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204432.62785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204432.62789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204432.62792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204432.62801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204432.62821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204432.62862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204432.62884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34052 1727204432.62911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204432.62948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204432.62959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204432.63007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204432.63030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204432.63047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204432.63107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204432.63135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204432.63269: variable 'network_connections' from source: task vars 34052 1727204432.63273: variable 'interface' from source: play vars 34052 1727204432.63357: variable 'interface' from source: play vars 34052 1727204432.63448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204432.63617: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204432.63662: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204432.63696: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204432.63730: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204432.63779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204432.63803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204432.63830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204432.63854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 
1727204432.63992: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204432.64184: variable 'network_connections' from source: task vars 34052 1727204432.64190: variable 'interface' from source: play vars 34052 1727204432.64261: variable 'interface' from source: play vars 34052 1727204432.64313: Evaluated conditional (__network_wpa_supplicant_required): False 34052 1727204432.64317: when evaluation is False, skipping this task 34052 1727204432.64320: _execute() done 34052 1727204432.64322: dumping result to json 34052 1727204432.64324: done dumping result, returning 34052 1727204432.64332: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-66a4-e2a3-000000000024] 34052 1727204432.64345: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000024 34052 1727204432.64546: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000024 34052 1727204432.64549: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 34052 1727204432.64621: no more pending results, returning what we have 34052 1727204432.64623: results queue empty 34052 1727204432.64624: checking for any_errors_fatal 34052 1727204432.64648: done checking for any_errors_fatal 34052 1727204432.64649: checking for max_fail_percentage 34052 1727204432.64651: done checking for max_fail_percentage 34052 1727204432.64651: checking to see if all hosts have failed and the running result is not ok 34052 1727204432.64652: done checking to see if all hosts have failed 34052 1727204432.64653: getting the remaining hosts for this loop 34052 1727204432.64654: done getting the remaining hosts for this loop 34052 1727204432.64658: getting the next task for host managed-node1 34052 1727204432.64664: done getting next task for host managed-node1 34052 1727204432.64669: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34052 1727204432.64672: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204432.64687: getting variables 34052 1727204432.64688: in VariableManager get_vars() 34052 1727204432.64752: Calling all_inventory to load vars for managed-node1 34052 1727204432.64756: Calling groups_inventory to load vars for managed-node1 34052 1727204432.64759: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204432.64774: Calling all_plugins_play to load vars for managed-node1 34052 1727204432.64776: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204432.64779: Calling groups_plugins_play to load vars for managed-node1 34052 1727204432.65791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204432.67294: done with get_vars() 34052 1727204432.67332: done getting variables 34052 1727204432.67402: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:00:32 -0400 (0:00:00.089) 0:00:18.991 ***** 34052 1727204432.67437: entering _queue_task() for managed-node1/service 34052 1727204432.67823: worker is 1 (out of 1 available) 34052 1727204432.67838: exiting _queue_task() for managed-node1/service 34052 1727204432.67853: done queuing things up, now waiting for results queue to drain 34052 1727204432.67854: waiting for pending results... 34052 1727204432.68140: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 34052 1727204432.68239: in run() - task 127b8e07-fff9-66a4-e2a3-000000000025 34052 1727204432.68254: variable 'ansible_search_path' from source: unknown 34052 1727204432.68257: variable 'ansible_search_path' from source: unknown 34052 1727204432.68295: calling self._execute() 34052 1727204432.68374: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204432.68386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204432.68389: variable 'omit' from source: magic vars 34052 1727204432.68702: variable 'ansible_distribution_major_version' from source: facts 34052 1727204432.68713: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204432.68799: variable 'network_provider' from source: set_fact 34052 1727204432.68804: Evaluated conditional (network_provider == "initscripts"): False 34052 1727204432.68809: when evaluation is False, skipping this task 34052 1727204432.68812: _execute() done 34052 1727204432.68816: dumping result to json 34052 1727204432.68819: done dumping result, returning 34052 1727204432.68832: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-66a4-e2a3-000000000025] 34052 1727204432.68835: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000025 34052 1727204432.68924: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000025 34052 1727204432.68936: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 
1727204432.69001: no more pending results, returning what we have 34052 1727204432.69005: results queue empty 34052 1727204432.69006: checking for any_errors_fatal 34052 1727204432.69016: done checking for any_errors_fatal 34052 1727204432.69017: checking for max_fail_percentage 34052 1727204432.69019: done checking for max_fail_percentage 34052 1727204432.69020: checking to see if all hosts have failed and the running result is not ok 34052 1727204432.69021: done checking to see if all hosts have failed 34052 1727204432.69021: getting the remaining hosts for this loop 34052 1727204432.69023: done getting the remaining hosts for this loop 34052 1727204432.69030: getting the next task for host managed-node1 34052 1727204432.69042: done getting next task for host managed-node1 34052 1727204432.69047: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34052 1727204432.69051: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204432.69070: getting variables 34052 1727204432.69072: in VariableManager get_vars() 34052 1727204432.69111: Calling all_inventory to load vars for managed-node1 34052 1727204432.69114: Calling groups_inventory to load vars for managed-node1 34052 1727204432.69116: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204432.69128: Calling all_plugins_play to load vars for managed-node1 34052 1727204432.69131: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204432.69134: Calling groups_plugins_play to load vars for managed-node1 34052 1727204432.70634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204432.72814: done with get_vars() 34052 1727204432.72859: done getting variables 34052 1727204432.72938: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:00:32 -0400 (0:00:00.055) 0:00:19.046 ***** 34052 1727204432.72981: entering _queue_task() for managed-node1/copy 34052 1727204432.73331: worker is 1 (out of 1 available) 34052 1727204432.73347: exiting _queue_task() for managed-node1/copy 34052 1727204432.73362: done queuing things up, now waiting for results queue to drain 34052 1727204432.73364: waiting for pending results... 
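[editor's note] "Enable network service" above is skipped because network_provider was set to "nm", so the initscripts-only conditional evaluates to False, and the copy task queued next ("Ensure initscripts network file dependency is present") is skipped for the same reason. A sketch of the conditional pattern the role applies; only the when condition network_provider == "initscripts" is confirmed by this log, the module arguments are hypothetical:

    - name: Enable network service
      service:
        name: network        # hypothetical service name, not shown in this log
        state: started
        enabled: true
      when: network_provider == "initscripts"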
34052 1727204432.73839: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34052 1727204432.73845: in run() - task 127b8e07-fff9-66a4-e2a3-000000000026 34052 1727204432.73849: variable 'ansible_search_path' from source: unknown 34052 1727204432.73851: variable 'ansible_search_path' from source: unknown 34052 1727204432.73977: calling self._execute() 34052 1727204432.74153: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204432.74158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204432.74161: variable 'omit' from source: magic vars 34052 1727204432.74479: variable 'ansible_distribution_major_version' from source: facts 34052 1727204432.74494: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204432.74638: variable 'network_provider' from source: set_fact 34052 1727204432.74651: Evaluated conditional (network_provider == "initscripts"): False 34052 1727204432.74654: when evaluation is False, skipping this task 34052 1727204432.74657: _execute() done 34052 1727204432.74660: dumping result to json 34052 1727204432.74664: done dumping result, returning 34052 1727204432.74678: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-66a4-e2a3-000000000026] 34052 1727204432.74681: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000026 skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 34052 1727204432.74911: no more pending results, returning what we have 34052 1727204432.74917: results queue empty 34052 1727204432.74918: checking for any_errors_fatal 34052 1727204432.74927: done checking for any_errors_fatal 34052 1727204432.74928: checking for max_fail_percentage 34052 1727204432.74930: done checking for max_fail_percentage 34052 1727204432.74931: checking to see if all hosts have failed and the running result is not ok 34052 1727204432.74932: done checking to see if all hosts have failed 34052 1727204432.74933: getting the remaining hosts for this loop 34052 1727204432.74935: done getting the remaining hosts for this loop 34052 1727204432.74940: getting the next task for host managed-node1 34052 1727204432.74947: done getting next task for host managed-node1 34052 1727204432.74953: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34052 1727204432.74957: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204432.74981: getting variables 34052 1727204432.74984: in VariableManager get_vars() 34052 1727204432.75034: Calling all_inventory to load vars for managed-node1 34052 1727204432.75038: Calling groups_inventory to load vars for managed-node1 34052 1727204432.75041: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204432.75059: Calling all_plugins_play to load vars for managed-node1 34052 1727204432.75063: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204432.75174: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000026 34052 1727204432.75182: Calling groups_plugins_play to load vars for managed-node1 34052 1727204432.75709: WORKER PROCESS EXITING 34052 1727204432.76640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204432.77863: done with get_vars() 34052 1727204432.77899: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:00:32 -0400 (0:00:00.049) 0:00:19.096 ***** 34052 1727204432.77973: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34052 1727204432.77975: Creating lock for fedora.linux_system_roles.network_connections 34052 1727204432.78278: worker is 1 (out of 1 available) 34052 1727204432.78295: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34052 1727204432.78309: done queuing things up, now waiting for results queue to drain 34052 1727204432.78311: waiting for pending results... 34052 1727204432.78511: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34052 1727204432.78614: in run() - task 127b8e07-fff9-66a4-e2a3-000000000027 34052 1727204432.78627: variable 'ansible_search_path' from source: unknown 34052 1727204432.78632: variable 'ansible_search_path' from source: unknown 34052 1727204432.78675: calling self._execute() 34052 1727204432.78758: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204432.78764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204432.78776: variable 'omit' from source: magic vars 34052 1727204432.79096: variable 'ansible_distribution_major_version' from source: facts 34052 1727204432.79108: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204432.79115: variable 'omit' from source: magic vars 34052 1727204432.79164: variable 'omit' from source: magic vars 34052 1727204432.79303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204432.81005: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204432.81064: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204432.81092: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204432.81119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204432.81143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204432.81217: 
variable 'network_provider' from source: set_fact 34052 1727204432.81328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204432.81366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204432.81390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204432.81418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204432.81432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204432.81498: variable 'omit' from source: magic vars 34052 1727204432.81589: variable 'omit' from source: magic vars 34052 1727204432.81670: variable 'network_connections' from source: task vars 34052 1727204432.81680: variable 'interface' from source: play vars 34052 1727204432.81735: variable 'interface' from source: play vars 34052 1727204432.81860: variable 'omit' from source: magic vars 34052 1727204432.81868: variable '__lsr_ansible_managed' from source: task vars 34052 1727204432.81913: variable '__lsr_ansible_managed' from source: task vars 34052 1727204432.82137: Loaded config def from plugin (lookup/template) 34052 1727204432.82141: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 34052 1727204432.82168: File lookup term: get_ansible_managed.j2 34052 1727204432.82171: variable 'ansible_search_path' from source: unknown 34052 1727204432.82176: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 34052 1727204432.82188: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 34052 1727204432.82202: variable 'ansible_search_path' from source: unknown 34052 1727204432.86599: variable 'ansible_managed' from source: unknown 34052 1727204432.86712: variable 'omit' from source: 
magic vars 34052 1727204432.86739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204432.86762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204432.86780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204432.86795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204432.86810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204432.86833: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204432.86836: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204432.86839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204432.86906: Set connection var ansible_connection to ssh 34052 1727204432.86914: Set connection var ansible_timeout to 10 34052 1727204432.86917: Set connection var ansible_pipelining to False 34052 1727204432.86920: Set connection var ansible_shell_type to sh 34052 1727204432.86937: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204432.86940: Set connection var ansible_shell_executable to /bin/sh 34052 1727204432.86960: variable 'ansible_shell_executable' from source: unknown 34052 1727204432.86963: variable 'ansible_connection' from source: unknown 34052 1727204432.86968: variable 'ansible_module_compression' from source: unknown 34052 1727204432.86970: variable 'ansible_shell_type' from source: unknown 34052 1727204432.86973: variable 'ansible_shell_executable' from source: unknown 34052 1727204432.86975: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204432.86979: variable 'ansible_pipelining' from source: unknown 34052 1727204432.86983: variable 'ansible_timeout' from source: unknown 34052 1727204432.86986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204432.87098: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204432.87110: variable 'omit' from source: magic vars 34052 1727204432.87116: starting attempt loop 34052 1727204432.87120: running the handler 34052 1727204432.87134: _low_level_execute_command(): starting 34052 1727204432.87146: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204432.87719: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204432.87723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.87729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204432.87733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.87774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204432.87777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204432.87780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204432.87851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204432.89623: stdout chunk (state=3): >>>/root <<< 34052 1727204432.89725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204432.89798: stderr chunk (state=3): >>><<< 34052 1727204432.89802: stdout chunk (state=3): >>><<< 34052 1727204432.89821: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204432.89833: _low_level_execute_command(): starting 34052 1727204432.89841: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537 `" && echo ansible-tmp-1727204432.8982215-35207-217543816035537="` echo /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537 `" ) && sleep 0' 34052 1727204432.90370: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204432.90375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.90377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
34052 1727204432.90379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204432.90440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204432.90444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204432.90505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204432.92598: stdout chunk (state=3): >>>ansible-tmp-1727204432.8982215-35207-217543816035537=/root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537 <<< 34052 1727204432.92714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204432.92781: stderr chunk (state=3): >>><<< 34052 1727204432.92784: stdout chunk (state=3): >>><<< 34052 1727204432.92802: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204432.8982215-35207-217543816035537=/root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204432.92848: variable 'ansible_module_compression' from source: unknown 34052 1727204432.92893: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 34052 1727204432.92896: ANSIBALLZ: Acquiring lock 34052 1727204432.92900: ANSIBALLZ: Lock acquired: 140141528711728 34052 1727204432.92904: ANSIBALLZ: Creating module 34052 1727204433.13661: ANSIBALLZ: Writing module into payload 34052 1727204433.13973: ANSIBALLZ: Writing module 34052 1727204433.14073: ANSIBALLZ: Renaming module 34052 1727204433.14076: ANSIBALLZ: Done creating module 34052 1727204433.14079: variable 'ansible_facts' from source: unknown 34052 1727204433.14147: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/AnsiballZ_network_connections.py 34052 1727204433.14403: Sending initial data 34052 1727204433.14406: Sent initial data (168 bytes) 34052 1727204433.14996: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204433.15086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204433.15126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204433.15145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204433.15175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204433.15271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204433.16988: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34052 1727204433.17048: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204433.17102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204433.17182: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpg9vw3gtz /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/AnsiballZ_network_connections.py <<< 34052 1727204433.17186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/AnsiballZ_network_connections.py" <<< 34052 1727204433.17219: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpg9vw3gtz" to remote "/root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/AnsiballZ_network_connections.py" <<< 34052 1727204433.18715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204433.18823: stderr chunk (state=3): >>><<< 34052 1727204433.18830: stdout chunk (state=3): >>><<< 34052 1727204433.18836: done transferring module to remote 34052 1727204433.18853: _low_level_execute_command(): starting 34052 1727204433.18862: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/ /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/AnsiballZ_network_connections.py && sleep 0' 34052 1727204433.19573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204433.19583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204433.19586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204433.19593: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204433.19645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204433.19692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204433.21690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204433.21726: stderr chunk (state=3): >>><<< 34052 1727204433.21736: stdout chunk (state=3): >>><<< 34052 1727204433.21844: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204433.21848: _low_level_execute_command(): starting 34052 1727204433.21850: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/AnsiballZ_network_connections.py && sleep 0' 34052 1727204433.22425: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204433.22445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204433.22463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204433.22486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204433.22503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204433.22604: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204433.22633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204433.22732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204435.25091: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": 
false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 34052 1727204435.27133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204435.27201: stderr chunk (state=3): >>><<< 34052 1727204435.27205: stdout chunk (state=3): >>><<< 34052 1727204435.27221: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204435.27267: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204435.27275: _low_level_execute_command(): starting 34052 1727204435.27281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204432.8982215-35207-217543816035537/ > /dev/null 2>&1 && sleep 0' 34052 1727204435.27773: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204435.27777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204435.27786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204435.27804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204435.27844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204435.27847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204435.27858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204435.27915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204435.29992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204435.30052: stderr chunk (state=3): >>><<< 34052 1727204435.30058: stdout chunk (state=3): >>><<< 34052 1727204435.30074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204435.30083: handler run complete 34052 1727204435.30107: attempt loop complete, returning result 34052 1727204435.30110: _execute() done 34052 1727204435.30113: dumping result to json 34052 1727204435.30119: done dumping result, returning 34052 1727204435.30129: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-66a4-e2a3-000000000027] 34052 1727204435.30134: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000027 34052 1727204435.30255: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000027 34052 1727204435.30259: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef (not-active) 34052 1727204435.30385: no more pending results, returning what we have 34052 1727204435.30388: results queue empty 34052 1727204435.30389: checking for any_errors_fatal 34052 1727204435.30397: done checking for any_errors_fatal 34052 1727204435.30398: checking for max_fail_percentage 34052 1727204435.30399: done checking for max_fail_percentage 34052 1727204435.30400: checking to see if all hosts have failed and the running result is not ok 34052 1727204435.30401: done checking to see if all hosts have failed 34052 1727204435.30402: getting the remaining hosts for this loop 34052 1727204435.30403: done getting the remaining hosts for this loop 34052 1727204435.30408: getting the next task for host managed-node1 34052 1727204435.30414: done getting next task for host managed-node1 34052 1727204435.30418: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34052 1727204435.30421: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204435.30433: getting variables 34052 1727204435.30434: in VariableManager get_vars() 34052 1727204435.30483: Calling all_inventory to load vars for managed-node1 34052 1727204435.30486: Calling groups_inventory to load vars for managed-node1 34052 1727204435.30489: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204435.30499: Calling all_plugins_play to load vars for managed-node1 34052 1727204435.30502: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204435.30505: Calling groups_plugins_play to load vars for managed-node1 34052 1727204435.31592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204435.32806: done with get_vars() 34052 1727204435.32838: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:00:35 -0400 (0:00:02.549) 0:00:21.646 ***** 34052 1727204435.32914: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34052 1727204435.32916: Creating lock for fedora.linux_system_roles.network_state 34052 1727204435.33211: worker is 1 (out of 1 available) 34052 1727204435.33226: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34052 1727204435.33240: done queuing things up, now waiting for results queue to drain 34052 1727204435.33242: waiting for pending results... 34052 1727204435.33437: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 34052 1727204435.33545: in run() - task 127b8e07-fff9-66a4-e2a3-000000000028 34052 1727204435.33560: variable 'ansible_search_path' from source: unknown 34052 1727204435.33564: variable 'ansible_search_path' from source: unknown 34052 1727204435.33600: calling self._execute() 34052 1727204435.33680: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.33684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.33694: variable 'omit' from source: magic vars 34052 1727204435.34004: variable 'ansible_distribution_major_version' from source: facts 34052 1727204435.34027: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204435.34116: variable 'network_state' from source: role '' defaults 34052 1727204435.34133: Evaluated conditional (network_state != {}): False 34052 1727204435.34138: when evaluation is False, skipping this task 34052 1727204435.34141: _execute() done 34052 1727204435.34144: dumping result to json 34052 1727204435.34146: done dumping result, returning 34052 1727204435.34149: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-66a4-e2a3-000000000028] 34052 1727204435.34152: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000028 34052 1727204435.34252: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000028 34052 1727204435.34255: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204435.34312: no more pending results, returning what we have 34052 
1727204435.34316: results queue empty 34052 1727204435.34317: checking for any_errors_fatal 34052 1727204435.34335: done checking for any_errors_fatal 34052 1727204435.34336: checking for max_fail_percentage 34052 1727204435.34338: done checking for max_fail_percentage 34052 1727204435.34339: checking to see if all hosts have failed and the running result is not ok 34052 1727204435.34340: done checking to see if all hosts have failed 34052 1727204435.34340: getting the remaining hosts for this loop 34052 1727204435.34342: done getting the remaining hosts for this loop 34052 1727204435.34346: getting the next task for host managed-node1 34052 1727204435.34352: done getting next task for host managed-node1 34052 1727204435.34356: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34052 1727204435.34359: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204435.34378: getting variables 34052 1727204435.34379: in VariableManager get_vars() 34052 1727204435.34418: Calling all_inventory to load vars for managed-node1 34052 1727204435.34424: Calling groups_inventory to load vars for managed-node1 34052 1727204435.34427: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204435.34437: Calling all_plugins_play to load vars for managed-node1 34052 1727204435.34440: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204435.34442: Calling groups_plugins_play to load vars for managed-node1 34052 1727204435.35629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204435.36953: done with get_vars() 34052 1727204435.36993: done getting variables 34052 1727204435.37065: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:00:35 -0400 (0:00:00.041) 0:00:21.688 ***** 34052 1727204435.37106: entering _queue_task() for managed-node1/debug 34052 1727204435.37488: worker is 1 (out of 1 available) 34052 1727204435.37502: exiting _queue_task() for managed-node1/debug 34052 1727204435.37517: done queuing things up, now waiting for results queue to drain 34052 1727204435.37518: waiting for pending results... 
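
For reference, the module_args dumped in the "Configure networking connection profiles" result above imply a role invocation along the following lines. This is a sketch reconstructed from the log, not the test playbook itself (which is not reproduced here): the play structure and the variable names network_provider and network_connections are the role's usual entry points and are assumptions, while the connection values are taken directly from the module_args shown above.

# Sketch: play that would produce the veth0 profile applied above
# (play name and hosts line are assumptions; connection values come from module_args)
- name: Configure veth0 via fedora.linux_system_roles.network (sketch)
  hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm            # matches "provider": "nm" in module_args
        network_connections:
          - name: veth0
            type: ethernet
            state: up
            ip:
              dhcp4: false
              auto6: false
              gateway6: "2001:db8::1"
              address:
                - "2001:db8::2/32"
                - "2001:db8::3/32"
                - "2001:db8::4/32"
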
34052 1727204435.37894: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34052 1727204435.38025: in run() - task 127b8e07-fff9-66a4-e2a3-000000000029 34052 1727204435.38029: variable 'ansible_search_path' from source: unknown 34052 1727204435.38032: variable 'ansible_search_path' from source: unknown 34052 1727204435.38070: calling self._execute() 34052 1727204435.38152: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.38159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.38171: variable 'omit' from source: magic vars 34052 1727204435.38484: variable 'ansible_distribution_major_version' from source: facts 34052 1727204435.38495: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204435.38504: variable 'omit' from source: magic vars 34052 1727204435.38553: variable 'omit' from source: magic vars 34052 1727204435.38583: variable 'omit' from source: magic vars 34052 1727204435.38620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204435.38654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204435.38673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204435.38690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204435.38700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204435.38729: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204435.38734: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.38737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.38814: Set connection var ansible_connection to ssh 34052 1727204435.38821: Set connection var ansible_timeout to 10 34052 1727204435.38831: Set connection var ansible_pipelining to False 34052 1727204435.38834: Set connection var ansible_shell_type to sh 34052 1727204435.38841: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204435.38848: Set connection var ansible_shell_executable to /bin/sh 34052 1727204435.38872: variable 'ansible_shell_executable' from source: unknown 34052 1727204435.38875: variable 'ansible_connection' from source: unknown 34052 1727204435.38878: variable 'ansible_module_compression' from source: unknown 34052 1727204435.38881: variable 'ansible_shell_type' from source: unknown 34052 1727204435.38883: variable 'ansible_shell_executable' from source: unknown 34052 1727204435.38886: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.38888: variable 'ansible_pipelining' from source: unknown 34052 1727204435.38892: variable 'ansible_timeout' from source: unknown 34052 1727204435.38896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.39017: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 
1727204435.39026: variable 'omit' from source: magic vars 34052 1727204435.39038: starting attempt loop 34052 1727204435.39041: running the handler 34052 1727204435.39153: variable '__network_connections_result' from source: set_fact 34052 1727204435.39201: handler run complete 34052 1727204435.39216: attempt loop complete, returning result 34052 1727204435.39219: _execute() done 34052 1727204435.39222: dumping result to json 34052 1727204435.39225: done dumping result, returning 34052 1727204435.39237: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-66a4-e2a3-000000000029] 34052 1727204435.39241: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000029 34052 1727204435.39337: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000029 34052 1727204435.39340: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef (not-active)" ] } 34052 1727204435.39410: no more pending results, returning what we have 34052 1727204435.39414: results queue empty 34052 1727204435.39415: checking for any_errors_fatal 34052 1727204435.39423: done checking for any_errors_fatal 34052 1727204435.39424: checking for max_fail_percentage 34052 1727204435.39425: done checking for max_fail_percentage 34052 1727204435.39426: checking to see if all hosts have failed and the running result is not ok 34052 1727204435.39427: done checking to see if all hosts have failed 34052 1727204435.39428: getting the remaining hosts for this loop 34052 1727204435.39429: done getting the remaining hosts for this loop 34052 1727204435.39434: getting the next task for host managed-node1 34052 1727204435.39440: done getting next task for host managed-node1 34052 1727204435.39444: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34052 1727204435.39447: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204435.39461: getting variables 34052 1727204435.39462: in VariableManager get_vars() 34052 1727204435.39511: Calling all_inventory to load vars for managed-node1 34052 1727204435.39514: Calling groups_inventory to load vars for managed-node1 34052 1727204435.39516: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204435.39526: Calling all_plugins_play to load vars for managed-node1 34052 1727204435.39528: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204435.39532: Calling groups_plugins_play to load vars for managed-node1 34052 1727204435.41458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204435.43596: done with get_vars() 34052 1727204435.43639: done getting variables 34052 1727204435.43711: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:00:35 -0400 (0:00:00.066) 0:00:21.754 ***** 34052 1727204435.43747: entering _queue_task() for managed-node1/debug 34052 1727204435.44128: worker is 1 (out of 1 available) 34052 1727204435.44143: exiting _queue_task() for managed-node1/debug 34052 1727204435.44156: done queuing things up, now waiting for results queue to drain 34052 1727204435.44158: waiting for pending results... 34052 1727204435.44588: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34052 1727204435.44626: in run() - task 127b8e07-fff9-66a4-e2a3-00000000002a 34052 1727204435.44649: variable 'ansible_search_path' from source: unknown 34052 1727204435.44656: variable 'ansible_search_path' from source: unknown 34052 1727204435.44721: calling self._execute() 34052 1727204435.44829: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.44902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.44906: variable 'omit' from source: magic vars 34052 1727204435.45282: variable 'ansible_distribution_major_version' from source: facts 34052 1727204435.45301: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204435.45313: variable 'omit' from source: magic vars 34052 1727204435.45387: variable 'omit' from source: magic vars 34052 1727204435.45431: variable 'omit' from source: magic vars 34052 1727204435.45484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204435.45533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204435.45567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204435.45771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204435.45774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204435.45777: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204435.45779: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.45781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.45784: Set connection var ansible_connection to ssh 34052 1727204435.45786: Set connection var ansible_timeout to 10 34052 1727204435.45788: Set connection var ansible_pipelining to False 34052 1727204435.45790: Set connection var ansible_shell_type to sh 34052 1727204435.45801: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204435.45813: Set connection var ansible_shell_executable to /bin/sh 34052 1727204435.45842: variable 'ansible_shell_executable' from source: unknown 34052 1727204435.45849: variable 'ansible_connection' from source: unknown 34052 1727204435.45856: variable 'ansible_module_compression' from source: unknown 34052 1727204435.45862: variable 'ansible_shell_type' from source: unknown 34052 1727204435.45870: variable 'ansible_shell_executable' from source: unknown 34052 1727204435.45876: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.45882: variable 'ansible_pipelining' from source: unknown 34052 1727204435.45887: variable 'ansible_timeout' from source: unknown 34052 1727204435.45893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.46061: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204435.46082: variable 'omit' from source: magic vars 34052 1727204435.46092: starting attempt loop 34052 1727204435.46098: running the handler 34052 1727204435.46158: variable '__network_connections_result' from source: set_fact 34052 1727204435.46253: variable '__network_connections_result' from source: set_fact 34052 1727204435.46396: handler run complete 34052 1727204435.46433: attempt loop complete, returning result 34052 1727204435.46443: _execute() done 34052 1727204435.46454: dumping result to json 34052 1727204435.46561: done dumping result, returning 34052 1727204435.46565: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-66a4-e2a3-00000000002a] 34052 1727204435.46569: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000002a 34052 1727204435.46651: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000002a 34052 1727204435.46654: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef (not-active)\n", "stderr_lines": [ "[003] #0, state:up 
persistent_state:present, 'veth0': add connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, cbb7d200-7555-4a5b-af25-f6d228b691ef (not-active)" ] } } 34052 1727204435.46768: no more pending results, returning what we have 34052 1727204435.46771: results queue empty 34052 1727204435.46772: checking for any_errors_fatal 34052 1727204435.46781: done checking for any_errors_fatal 34052 1727204435.46782: checking for max_fail_percentage 34052 1727204435.46783: done checking for max_fail_percentage 34052 1727204435.46784: checking to see if all hosts have failed and the running result is not ok 34052 1727204435.46785: done checking to see if all hosts have failed 34052 1727204435.46786: getting the remaining hosts for this loop 34052 1727204435.46788: done getting the remaining hosts for this loop 34052 1727204435.46792: getting the next task for host managed-node1 34052 1727204435.46798: done getting next task for host managed-node1 34052 1727204435.46802: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34052 1727204435.46806: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204435.46818: getting variables 34052 1727204435.46820: in VariableManager get_vars() 34052 1727204435.46863: Calling all_inventory to load vars for managed-node1 34052 1727204435.46980: Calling groups_inventory to load vars for managed-node1 34052 1727204435.46983: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204435.46996: Calling all_plugins_play to load vars for managed-node1 34052 1727204435.46999: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204435.47003: Calling groups_plugins_play to load vars for managed-node1 34052 1727204435.49447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204435.53103: done with get_vars() 34052 1727204435.53144: done getting variables 34052 1727204435.53213: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:00:35 -0400 (0:00:00.094) 0:00:21.849 ***** 34052 1727204435.53250: entering _queue_task() for managed-node1/debug 34052 1727204435.53635: worker is 1 (out of 1 available) 34052 1727204435.53651: exiting _queue_task() for managed-node1/debug 34052 1727204435.53667: done queuing things up, now waiting for results queue to drain 34052 1727204435.53669: waiting for pending results... 
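
The two debug tasks above (tasks/main.yml:177 and :181) print the registered result of the connection-profile module. Judging by the task names and the variables that appear in their output, they are likely plain debug tasks along these lines; this is a sketch inferred from the log, not a copy of the role's tasks/main.yml.

# Sketch of the role's reporting tasks (names taken from the log,
# bodies assumed from the variables they print)
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
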
34052 1727204435.53990: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34052 1727204435.54140: in run() - task 127b8e07-fff9-66a4-e2a3-00000000002b 34052 1727204435.54164: variable 'ansible_search_path' from source: unknown 34052 1727204435.54174: variable 'ansible_search_path' from source: unknown 34052 1727204435.54271: calling self._execute() 34052 1727204435.54337: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.54348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.54363: variable 'omit' from source: magic vars 34052 1727204435.54796: variable 'ansible_distribution_major_version' from source: facts 34052 1727204435.54816: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204435.54962: variable 'network_state' from source: role '' defaults 34052 1727204435.54982: Evaluated conditional (network_state != {}): False 34052 1727204435.55058: when evaluation is False, skipping this task 34052 1727204435.55061: _execute() done 34052 1727204435.55064: dumping result to json 34052 1727204435.55067: done dumping result, returning 34052 1727204435.55070: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-66a4-e2a3-00000000002b] 34052 1727204435.55072: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000002b 34052 1727204435.55141: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000002b 34052 1727204435.55145: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 34052 1727204435.55219: no more pending results, returning what we have 34052 1727204435.55224: results queue empty 34052 1727204435.55225: checking for any_errors_fatal 34052 1727204435.55238: done checking for any_errors_fatal 34052 1727204435.55239: checking for max_fail_percentage 34052 1727204435.55240: done checking for max_fail_percentage 34052 1727204435.55241: checking to see if all hosts have failed and the running result is not ok 34052 1727204435.55243: done checking to see if all hosts have failed 34052 1727204435.55244: getting the remaining hosts for this loop 34052 1727204435.55246: done getting the remaining hosts for this loop 34052 1727204435.55251: getting the next task for host managed-node1 34052 1727204435.55258: done getting next task for host managed-node1 34052 1727204435.55263: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34052 1727204435.55270: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204435.55287: getting variables 34052 1727204435.55289: in VariableManager get_vars() 34052 1727204435.55338: Calling all_inventory to load vars for managed-node1 34052 1727204435.55342: Calling groups_inventory to load vars for managed-node1 34052 1727204435.55344: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204435.55359: Calling all_plugins_play to load vars for managed-node1 34052 1727204435.55363: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204435.55573: Calling groups_plugins_play to load vars for managed-node1 34052 1727204435.57599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204435.61053: done with get_vars() 34052 1727204435.61097: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:00:35 -0400 (0:00:00.081) 0:00:21.930 ***** 34052 1727204435.61360: entering _queue_task() for managed-node1/ping 34052 1727204435.61362: Creating lock for ping 34052 1727204435.61972: worker is 1 (out of 1 available) 34052 1727204435.61985: exiting _queue_task() for managed-node1/ping 34052 1727204435.61996: done queuing things up, now waiting for results queue to drain 34052 1727204435.61997: waiting for pending results... 34052 1727204435.62190: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34052 1727204435.62303: in run() - task 127b8e07-fff9-66a4-e2a3-00000000002c 34052 1727204435.62371: variable 'ansible_search_path' from source: unknown 34052 1727204435.62374: variable 'ansible_search_path' from source: unknown 34052 1727204435.62389: calling self._execute() 34052 1727204435.62506: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.62523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.62540: variable 'omit' from source: magic vars 34052 1727204435.63014: variable 'ansible_distribution_major_version' from source: facts 34052 1727204435.63069: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204435.63074: variable 'omit' from source: magic vars 34052 1727204435.63127: variable 'omit' from source: magic vars 34052 1727204435.63175: variable 'omit' from source: magic vars 34052 1727204435.63228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204435.63307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204435.63370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204435.63374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204435.63377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204435.63417: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204435.63427: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.63436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.63556: Set connection var ansible_connection to ssh 34052 
1727204435.63573: Set connection var ansible_timeout to 10 34052 1727204435.63584: Set connection var ansible_pipelining to False 34052 1727204435.63610: Set connection var ansible_shell_type to sh 34052 1727204435.63613: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204435.63619: Set connection var ansible_shell_executable to /bin/sh 34052 1727204435.63654: variable 'ansible_shell_executable' from source: unknown 34052 1727204435.63826: variable 'ansible_connection' from source: unknown 34052 1727204435.63832: variable 'ansible_module_compression' from source: unknown 34052 1727204435.63835: variable 'ansible_shell_type' from source: unknown 34052 1727204435.63837: variable 'ansible_shell_executable' from source: unknown 34052 1727204435.63840: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204435.63843: variable 'ansible_pipelining' from source: unknown 34052 1727204435.63845: variable 'ansible_timeout' from source: unknown 34052 1727204435.63847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204435.64090: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204435.64108: variable 'omit' from source: magic vars 34052 1727204435.64155: starting attempt loop 34052 1727204435.64158: running the handler 34052 1727204435.64160: _low_level_execute_command(): starting 34052 1727204435.64163: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204435.66214: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204435.66518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204435.66562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204435.68345: stdout chunk (state=3): >>>/root <<< 34052 1727204435.68553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204435.68557: stdout chunk (state=3): >>><<< 34052 1727204435.68559: stderr chunk (state=3): >>><<< 34052 1727204435.68589: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204435.68607: _low_level_execute_command(): starting 34052 1727204435.68611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133 `" && echo ansible-tmp-1727204435.6859071-35358-92904023034133="` echo /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133 `" ) && sleep 0' 34052 1727204435.69911: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204435.69917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204435.70089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204435.70193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204435.70290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204435.72363: stdout chunk (state=3): >>>ansible-tmp-1727204435.6859071-35358-92904023034133=/root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133 <<< 34052 1727204435.72479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204435.72703: stderr chunk (state=3): >>><<< 34052 1727204435.72707: stdout chunk (state=3): >>><<< 34052 1727204435.72835: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204435.6859071-35358-92904023034133=/root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204435.72838: variable 'ansible_module_compression' from source: unknown 34052 1727204435.73094: ANSIBALLZ: Using lock for ping 34052 1727204435.73098: ANSIBALLZ: Acquiring lock 34052 1727204435.73100: ANSIBALLZ: Lock acquired: 140141528326864 34052 1727204435.73103: ANSIBALLZ: Creating module 34052 1727204435.93044: ANSIBALLZ: Writing module into payload 34052 1727204435.93135: ANSIBALLZ: Writing module 34052 1727204435.93164: ANSIBALLZ: Renaming module 34052 1727204435.93181: ANSIBALLZ: Done creating module 34052 1727204435.93205: variable 'ansible_facts' from source: unknown 34052 1727204435.93283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/AnsiballZ_ping.py 34052 1727204435.93456: Sending initial data 34052 1727204435.93459: Sent initial data (152 bytes) 34052 1727204435.94230: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204435.94298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204435.94335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204435.94378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204435.94569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204435.96569: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 
2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204435.96579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204435.96635: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpftly3122 /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/AnsiballZ_ping.py <<< 34052 1727204435.96645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/AnsiballZ_ping.py" <<< 34052 1727204435.96681: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpftly3122" to remote "/root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/AnsiballZ_ping.py" <<< 34052 1727204435.98274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204435.98281: stderr chunk (state=3): >>><<< 34052 1727204435.98284: stdout chunk (state=3): >>><<< 34052 1727204435.98286: done transferring module to remote 34052 1727204435.98289: _low_level_execute_command(): starting 34052 1727204435.98291: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/ /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/AnsiballZ_ping.py && sleep 0' 34052 1727204435.99574: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204435.99846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204435.99983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.00081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.02244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204436.02254: stdout chunk (state=3): >>><<< 34052 1727204436.02257: stderr 
chunk (state=3): >>><<< 34052 1727204436.02382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204436.02386: _low_level_execute_command(): starting 34052 1727204436.02389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/AnsiballZ_ping.py && sleep 0' 34052 1727204436.03596: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.03790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204436.03981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.04039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.20849: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 34052 1727204436.22229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204436.22298: stderr chunk (state=3): >>><<< 34052 1727204436.22316: stdout chunk (state=3): >>><<< 34052 1727204436.22388: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204436.22456: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204436.22480: _low_level_execute_command(): starting 34052 1727204436.22491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204435.6859071-35358-92904023034133/ > /dev/null 2>&1 && sleep 0' 34052 1727204436.23713: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204436.23780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204436.23797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.23815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204436.23882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.23939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204436.23960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204436.23989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.24316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.26183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204436.26286: stderr chunk (state=3): >>><<< 34052 1727204436.26294: stdout chunk (state=3): >>><<< 34052 1727204436.26327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204436.26336: handler run complete 34052 1727204436.26350: attempt loop complete, returning result 34052 1727204436.26353: _execute() done 34052 1727204436.26356: dumping result to json 34052 1727204436.26358: done dumping result, returning 34052 1727204436.26471: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-66a4-e2a3-00000000002c] 34052 1727204436.26474: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000002c ok: [managed-node1] => { "changed": false, "ping": "pong" } 34052 1727204436.26676: no more pending results, returning what we have 34052 1727204436.26679: results queue empty 34052 1727204436.26680: checking for any_errors_fatal 34052 1727204436.26688: done checking for any_errors_fatal 34052 1727204436.26688: checking for max_fail_percentage 34052 1727204436.26690: done checking for max_fail_percentage 34052 1727204436.26691: checking to see if all hosts have failed and the running result is not ok 34052 1727204436.26692: done checking to see if all hosts have failed 34052 1727204436.26693: getting the remaining hosts for this loop 34052 1727204436.26694: done getting the remaining hosts for this loop 34052 1727204436.26699: getting the next task for host managed-node1 34052 1727204436.26708: done getting next task for host managed-node1 34052 1727204436.26710: ^ task is: TASK: meta (role_complete) 34052 1727204436.26714: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204436.26726: getting variables 34052 1727204436.26727: in VariableManager get_vars() 34052 1727204436.26991: Calling all_inventory to load vars for managed-node1 34052 1727204436.26995: Calling groups_inventory to load vars for managed-node1 34052 1727204436.26998: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.27005: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000002c 34052 1727204436.27008: WORKER PROCESS EXITING 34052 1727204436.27018: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.27022: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.27026: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.28912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204436.30590: done with get_vars() 34052 1727204436.30636: done getting variables 34052 1727204436.30739: done queuing things up, now waiting for results queue to drain 34052 1727204436.30742: results queue empty 34052 1727204436.30742: checking for any_errors_fatal 34052 1727204436.30746: done checking for any_errors_fatal 34052 1727204436.30746: checking for max_fail_percentage 34052 1727204436.30747: done checking for max_fail_percentage 34052 1727204436.30748: checking to see if all hosts have failed and the running result is not ok 34052 1727204436.30749: done checking to see if all hosts have failed 34052 1727204436.30749: getting the remaining hosts for this loop 34052 1727204436.30750: done getting the remaining hosts for this loop 34052 1727204436.30753: getting the next task for host managed-node1 34052 1727204436.30757: done getting next task for host managed-node1 34052 1727204436.30759: ^ task is: TASK: Include the task 'assert_device_present.yml' 34052 1727204436.30761: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204436.30763: getting variables 34052 1727204436.30764: in VariableManager get_vars() 34052 1727204436.30783: Calling all_inventory to load vars for managed-node1 34052 1727204436.30785: Calling groups_inventory to load vars for managed-node1 34052 1727204436.30787: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.30791: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.30793: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.30796: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.32779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204436.34454: done with get_vars() 34052 1727204436.34484: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.731) 0:00:22.662 ***** 34052 1727204436.34550: entering _queue_task() for managed-node1/include_tasks 34052 1727204436.34841: worker is 1 (out of 1 available) 34052 1727204436.34856: exiting _queue_task() for managed-node1/include_tasks 34052 1727204436.34872: done queuing things up, now waiting for results queue to drain 34052 1727204436.34874: waiting for pending results... 34052 1727204436.35068: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' 34052 1727204436.35175: in run() - task 127b8e07-fff9-66a4-e2a3-00000000005c 34052 1727204436.35198: variable 'ansible_search_path' from source: unknown 34052 1727204436.35241: calling self._execute() 34052 1727204436.35383: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.35480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204436.35498: variable 'omit' from source: magic vars 34052 1727204436.36372: variable 'ansible_distribution_major_version' from source: facts 34052 1727204436.36393: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204436.36404: _execute() done 34052 1727204436.36412: dumping result to json 34052 1727204436.36418: done dumping result, returning 34052 1727204436.36428: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' [127b8e07-fff9-66a4-e2a3-00000000005c] 34052 1727204436.36436: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005c 34052 1727204436.36573: no more pending results, returning what we have 34052 1727204436.36580: in VariableManager get_vars() 34052 1727204436.36641: Calling all_inventory to load vars for managed-node1 34052 1727204436.36644: Calling groups_inventory to load vars for managed-node1 34052 1727204436.36646: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.36668: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.36672: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.36677: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.37287: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005c 34052 1727204436.37291: WORKER PROCESS EXITING 34052 1727204436.42186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204436.43351: done with get_vars() 34052 
1727204436.43378: variable 'ansible_search_path' from source: unknown 34052 1727204436.43391: we have included files to process 34052 1727204436.43392: generating all_blocks data 34052 1727204436.43394: done generating all_blocks data 34052 1727204436.43396: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34052 1727204436.43397: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34052 1727204436.43399: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34052 1727204436.43510: in VariableManager get_vars() 34052 1727204436.43529: done with get_vars() 34052 1727204436.43613: done processing included file 34052 1727204436.43614: iterating over new_blocks loaded from include file 34052 1727204436.43615: in VariableManager get_vars() 34052 1727204436.43629: done with get_vars() 34052 1727204436.43630: filtering new block on tags 34052 1727204436.43644: done filtering new block on tags 34052 1727204436.43646: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 34052 1727204436.43649: extending task lists for all hosts with included blocks 34052 1727204436.45007: done extending task lists 34052 1727204436.45009: done processing included files 34052 1727204436.45010: results queue empty 34052 1727204436.45010: checking for any_errors_fatal 34052 1727204436.45012: done checking for any_errors_fatal 34052 1727204436.45012: checking for max_fail_percentage 34052 1727204436.45013: done checking for max_fail_percentage 34052 1727204436.45014: checking to see if all hosts have failed and the running result is not ok 34052 1727204436.45014: done checking to see if all hosts have failed 34052 1727204436.45015: getting the remaining hosts for this loop 34052 1727204436.45016: done getting the remaining hosts for this loop 34052 1727204436.45017: getting the next task for host managed-node1 34052 1727204436.45020: done getting next task for host managed-node1 34052 1727204436.45022: ^ task is: TASK: Include the task 'get_interface_stat.yml' 34052 1727204436.45024: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204436.45026: getting variables 34052 1727204436.45027: in VariableManager get_vars() 34052 1727204436.45041: Calling all_inventory to load vars for managed-node1 34052 1727204436.45043: Calling groups_inventory to load vars for managed-node1 34052 1727204436.45045: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.45052: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.45053: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.45055: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.45959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204436.47145: done with get_vars() 34052 1727204436.47171: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.126) 0:00:22.789 ***** 34052 1727204436.47235: entering _queue_task() for managed-node1/include_tasks 34052 1727204436.47534: worker is 1 (out of 1 available) 34052 1727204436.47548: exiting _queue_task() for managed-node1/include_tasks 34052 1727204436.47561: done queuing things up, now waiting for results queue to drain 34052 1727204436.47563: waiting for pending results... 34052 1727204436.47755: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 34052 1727204436.47828: in run() - task 127b8e07-fff9-66a4-e2a3-0000000002b5 34052 1727204436.47844: variable 'ansible_search_path' from source: unknown 34052 1727204436.47848: variable 'ansible_search_path' from source: unknown 34052 1727204436.47882: calling self._execute() 34052 1727204436.47969: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.47973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204436.47983: variable 'omit' from source: magic vars 34052 1727204436.48306: variable 'ansible_distribution_major_version' from source: facts 34052 1727204436.48316: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204436.48323: _execute() done 34052 1727204436.48326: dumping result to json 34052 1727204436.48337: done dumping result, returning 34052 1727204436.48340: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-66a4-e2a3-0000000002b5] 34052 1727204436.48343: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000002b5 34052 1727204436.48437: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000002b5 34052 1727204436.48442: WORKER PROCESS EXITING 34052 1727204436.48472: no more pending results, returning what we have 34052 1727204436.48478: in VariableManager get_vars() 34052 1727204436.48529: Calling all_inventory to load vars for managed-node1 34052 1727204436.48532: Calling groups_inventory to load vars for managed-node1 34052 1727204436.48534: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.48553: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.48556: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.48560: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.49591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 34052 1727204436.50783: done with get_vars() 34052 1727204436.50808: variable 'ansible_search_path' from source: unknown 34052 1727204436.50810: variable 'ansible_search_path' from source: unknown 34052 1727204436.50844: we have included files to process 34052 1727204436.50846: generating all_blocks data 34052 1727204436.50847: done generating all_blocks data 34052 1727204436.50849: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34052 1727204436.50850: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34052 1727204436.50851: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34052 1727204436.51035: done processing included file 34052 1727204436.51037: iterating over new_blocks loaded from include file 34052 1727204436.51038: in VariableManager get_vars() 34052 1727204436.51053: done with get_vars() 34052 1727204436.51054: filtering new block on tags 34052 1727204436.51068: done filtering new block on tags 34052 1727204436.51070: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 34052 1727204436.51075: extending task lists for all hosts with included blocks 34052 1727204436.51143: done extending task lists 34052 1727204436.51144: done processing included files 34052 1727204436.51145: results queue empty 34052 1727204436.51145: checking for any_errors_fatal 34052 1727204436.51148: done checking for any_errors_fatal 34052 1727204436.51149: checking for max_fail_percentage 34052 1727204436.51150: done checking for max_fail_percentage 34052 1727204436.51150: checking to see if all hosts have failed and the running result is not ok 34052 1727204436.51151: done checking to see if all hosts have failed 34052 1727204436.51151: getting the remaining hosts for this loop 34052 1727204436.51152: done getting the remaining hosts for this loop 34052 1727204436.51154: getting the next task for host managed-node1 34052 1727204436.51156: done getting next task for host managed-node1 34052 1727204436.51158: ^ task is: TASK: Get stat for interface {{ interface }} 34052 1727204436.51160: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204436.51162: getting variables 34052 1727204436.51163: in VariableManager get_vars() 34052 1727204436.51176: Calling all_inventory to load vars for managed-node1 34052 1727204436.51178: Calling groups_inventory to load vars for managed-node1 34052 1727204436.51179: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.51183: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.51185: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.51187: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.52133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204436.53310: done with get_vars() 34052 1727204436.53340: done getting variables 34052 1727204436.53480: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.062) 0:00:22.852 ***** 34052 1727204436.53504: entering _queue_task() for managed-node1/stat 34052 1727204436.53833: worker is 1 (out of 1 available) 34052 1727204436.53849: exiting _queue_task() for managed-node1/stat 34052 1727204436.53861: done queuing things up, now waiting for results queue to drain 34052 1727204436.53863: waiting for pending results... 34052 1727204436.54056: running TaskExecutor() for managed-node1/TASK: Get stat for interface veth0 34052 1727204436.54135: in run() - task 127b8e07-fff9-66a4-e2a3-0000000003a0 34052 1727204436.54148: variable 'ansible_search_path' from source: unknown 34052 1727204436.54152: variable 'ansible_search_path' from source: unknown 34052 1727204436.54189: calling self._execute() 34052 1727204436.54277: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.54282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204436.54291: variable 'omit' from source: magic vars 34052 1727204436.54611: variable 'ansible_distribution_major_version' from source: facts 34052 1727204436.54621: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204436.54631: variable 'omit' from source: magic vars 34052 1727204436.54672: variable 'omit' from source: magic vars 34052 1727204436.54749: variable 'interface' from source: play vars 34052 1727204436.54767: variable 'omit' from source: magic vars 34052 1727204436.54803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204436.54837: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204436.54855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204436.54875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204436.54884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204436.54909: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204436.54913: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.54916: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 34052 1727204436.54994: Set connection var ansible_connection to ssh 34052 1727204436.55001: Set connection var ansible_timeout to 10 34052 1727204436.55008: Set connection var ansible_pipelining to False 34052 1727204436.55011: Set connection var ansible_shell_type to sh 34052 1727204436.55018: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204436.55027: Set connection var ansible_shell_executable to /bin/sh 34052 1727204436.55046: variable 'ansible_shell_executable' from source: unknown 34052 1727204436.55050: variable 'ansible_connection' from source: unknown 34052 1727204436.55052: variable 'ansible_module_compression' from source: unknown 34052 1727204436.55055: variable 'ansible_shell_type' from source: unknown 34052 1727204436.55058: variable 'ansible_shell_executable' from source: unknown 34052 1727204436.55060: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.55065: variable 'ansible_pipelining' from source: unknown 34052 1727204436.55070: variable 'ansible_timeout' from source: unknown 34052 1727204436.55075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204436.55240: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204436.55250: variable 'omit' from source: magic vars 34052 1727204436.55256: starting attempt loop 34052 1727204436.55258: running the handler 34052 1727204436.55274: _low_level_execute_command(): starting 34052 1727204436.55281: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204436.55846: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.55852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204436.55855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.55857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.55911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204436.55914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204436.55917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.55983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.57748: stdout chunk (state=3): >>>/root <<< 34052 1727204436.57849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204436.57914: stderr chunk 
(state=3): >>><<< 34052 1727204436.57918: stdout chunk (state=3): >>><<< 34052 1727204436.57946: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204436.57956: _low_level_execute_command(): starting 34052 1727204436.57963: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290 `" && echo ansible-tmp-1727204436.579443-35545-270866446537290="` echo /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290 `" ) && sleep 0' 34052 1727204436.58516: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.58520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.58534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.58536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.58585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204436.58592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204436.58595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.58645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.60787: stdout chunk (state=3): >>>ansible-tmp-1727204436.579443-35545-270866446537290=/root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290 <<< 34052 1727204436.60864: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 34052 1727204436.61309: stderr chunk (state=3): >>><<< 34052 1727204436.61313: stdout chunk (state=3): >>><<< 34052 1727204436.61316: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204436.579443-35545-270866446537290=/root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204436.61318: variable 'ansible_module_compression' from source: unknown 34052 1727204436.61537: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 34052 1727204436.61698: variable 'ansible_facts' from source: unknown 34052 1727204436.61914: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/AnsiballZ_stat.py 34052 1727204436.62444: Sending initial data 34052 1727204436.62458: Sent initial data (152 bytes) 34052 1727204436.63180: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204436.63262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.63323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204436.63363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204436.63584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.63598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.65250: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204436.65292: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204436.65343: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp1pe04bu5 /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/AnsiballZ_stat.py <<< 34052 1727204436.65346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/AnsiballZ_stat.py" <<< 34052 1727204436.65386: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp1pe04bu5" to remote "/root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/AnsiballZ_stat.py" <<< 34052 1727204436.66022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204436.66187: stderr chunk (state=3): >>><<< 34052 1727204436.66190: stdout chunk (state=3): >>><<< 34052 1727204436.66193: done transferring module to remote 34052 1727204436.66195: _low_level_execute_command(): starting 34052 1727204436.66197: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/ /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/AnsiballZ_stat.py && sleep 0' 34052 1727204436.66635: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204436.66649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.66671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.66758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 34052 1727204436.66805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.68775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204436.68781: stdout chunk (state=3): >>><<< 34052 1727204436.68784: stderr chunk (state=3): >>><<< 34052 1727204436.68796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204436.68799: _low_level_execute_command(): starting 34052 1727204436.68805: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/AnsiballZ_stat.py && sleep 0' 34052 1727204436.69304: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.69308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.69311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.69313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.69373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204436.69377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204436.69384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.69455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.86448: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", 
"isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37957, "dev": 23, "nlink": 1, "atime": 1727204422.501025, "mtime": 1727204422.501025, "ctime": 1727204422.501025, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 34052 1727204436.87932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204436.87992: stderr chunk (state=3): >>><<< 34052 1727204436.87996: stdout chunk (state=3): >>><<< 34052 1727204436.88012: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37957, "dev": 23, "nlink": 1, "atime": 1727204422.501025, "mtime": 1727204422.501025, "ctime": 1727204422.501025, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204436.88061: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204436.88073: _low_level_execute_command(): starting 34052 1727204436.88078: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204436.579443-35545-270866446537290/ > /dev/null 2>&1 && sleep 0' 34052 1727204436.88558: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204436.88563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.88590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204436.88594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204436.88658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204436.88661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204436.88664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204436.88722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204436.90758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204436.90763: stdout chunk (state=3): >>><<< 34052 1727204436.90768: stderr chunk (state=3): >>><<< 34052 1727204436.90785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204436.90791: handler run complete 34052 1727204436.90825: attempt loop complete, returning result 34052 1727204436.90830: _execute() done 34052 1727204436.90833: dumping result to json 34052 1727204436.90839: done dumping result, returning 34052 1727204436.90847: done running TaskExecutor() for managed-node1/TASK: Get stat for interface veth0 [127b8e07-fff9-66a4-e2a3-0000000003a0] 34052 1727204436.90852: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003a0 34052 1727204436.90967: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003a0 34052 1727204436.90970: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204422.501025, "block_size": 4096, "blocks": 0, "ctime": 1727204422.501025, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 37957, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727204422.501025, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 34052 1727204436.91087: no more pending results, returning what we have 34052 1727204436.91090: results queue empty 34052 1727204436.91091: checking for any_errors_fatal 34052 1727204436.91092: done checking for any_errors_fatal 34052 1727204436.91093: checking for max_fail_percentage 34052 1727204436.91095: done checking for max_fail_percentage 34052 1727204436.91095: checking to see if all hosts have failed and the running result is not ok 34052 1727204436.91096: done checking to see if all hosts have failed 34052 1727204436.91097: getting the remaining hosts for this loop 34052 1727204436.91099: done getting the remaining hosts for this loop 34052 1727204436.91103: getting the next task for host managed-node1 34052 1727204436.91111: done getting next task for host managed-node1 34052 1727204436.91114: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 34052 1727204436.91116: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204436.91121: getting variables 34052 1727204436.91122: in VariableManager get_vars() 34052 1727204436.91160: Calling all_inventory to load vars for managed-node1 34052 1727204436.91163: Calling groups_inventory to load vars for managed-node1 34052 1727204436.91170: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.91181: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.91184: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.91187: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.92309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204436.93525: done with get_vars() 34052 1727204436.93556: done getting variables 34052 1727204436.93663: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34052 1727204436.93758: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.402) 0:00:23.255 ***** 34052 1727204436.93785: entering _queue_task() for managed-node1/assert 34052 1727204436.93786: Creating lock for assert 34052 1727204436.94110: worker is 1 (out of 1 available) 34052 1727204436.94128: exiting _queue_task() for managed-node1/assert 34052 1727204436.94142: done queuing things up, now waiting for results queue to drain 34052 1727204436.94145: waiting for pending results... 
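
At this point the task "Assert that the interface is present - 'veth0'" (assert_device_present.yml:5) has been queued. The conditional the log evaluates a little further down, interface_stat.stat.exists, suggests the task is essentially the sketch below; any fail_msg or additional conditions are not visible in the log and are therefore omitted.

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists   # the log shows exactly this conditional evaluating to True
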
34052 1727204436.94350: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'veth0' 34052 1727204436.94437: in run() - task 127b8e07-fff9-66a4-e2a3-0000000002b6 34052 1727204436.94453: variable 'ansible_search_path' from source: unknown 34052 1727204436.94457: variable 'ansible_search_path' from source: unknown 34052 1727204436.94494: calling self._execute() 34052 1727204436.94597: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.94601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204436.94612: variable 'omit' from source: magic vars 34052 1727204436.94960: variable 'ansible_distribution_major_version' from source: facts 34052 1727204436.94974: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204436.94982: variable 'omit' from source: magic vars 34052 1727204436.95034: variable 'omit' from source: magic vars 34052 1727204436.95270: variable 'interface' from source: play vars 34052 1727204436.95274: variable 'omit' from source: magic vars 34052 1727204436.95277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204436.95310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204436.95342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204436.95373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204436.95392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204436.95434: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204436.95444: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.95453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204436.95615: Set connection var ansible_connection to ssh 34052 1727204436.95633: Set connection var ansible_timeout to 10 34052 1727204436.95648: Set connection var ansible_pipelining to False 34052 1727204436.95656: Set connection var ansible_shell_type to sh 34052 1727204436.95673: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204436.95687: Set connection var ansible_shell_executable to /bin/sh 34052 1727204436.95738: variable 'ansible_shell_executable' from source: unknown 34052 1727204436.95971: variable 'ansible_connection' from source: unknown 34052 1727204436.95974: variable 'ansible_module_compression' from source: unknown 34052 1727204436.95981: variable 'ansible_shell_type' from source: unknown 34052 1727204436.95986: variable 'ansible_shell_executable' from source: unknown 34052 1727204436.95996: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204436.96002: variable 'ansible_pipelining' from source: unknown 34052 1727204436.96006: variable 'ansible_timeout' from source: unknown 34052 1727204436.96010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204436.96039: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34052 1727204436.96060: variable 'omit' from source: magic vars 34052 1727204436.96074: starting attempt loop 34052 1727204436.96086: running the handler 34052 1727204436.96286: variable 'interface_stat' from source: set_fact 34052 1727204436.96305: Evaluated conditional (interface_stat.stat.exists): True 34052 1727204436.96317: handler run complete 34052 1727204436.96340: attempt loop complete, returning result 34052 1727204436.96348: _execute() done 34052 1727204436.96354: dumping result to json 34052 1727204436.96361: done dumping result, returning 34052 1727204436.96388: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'veth0' [127b8e07-fff9-66a4-e2a3-0000000002b6] 34052 1727204436.96402: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000002b6 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34052 1727204436.96620: no more pending results, returning what we have 34052 1727204436.96623: results queue empty 34052 1727204436.96624: checking for any_errors_fatal 34052 1727204436.96636: done checking for any_errors_fatal 34052 1727204436.96637: checking for max_fail_percentage 34052 1727204436.96639: done checking for max_fail_percentage 34052 1727204436.96640: checking to see if all hosts have failed and the running result is not ok 34052 1727204436.96640: done checking to see if all hosts have failed 34052 1727204436.96641: getting the remaining hosts for this loop 34052 1727204436.96643: done getting the remaining hosts for this loop 34052 1727204436.96649: getting the next task for host managed-node1 34052 1727204436.96656: done getting next task for host managed-node1 34052 1727204436.96659: ^ task is: TASK: Include the task 'assert_profile_present.yml' 34052 1727204436.96661: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204436.96667: getting variables 34052 1727204436.96669: in VariableManager get_vars() 34052 1727204436.96716: Calling all_inventory to load vars for managed-node1 34052 1727204436.96720: Calling groups_inventory to load vars for managed-node1 34052 1727204436.96722: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204436.96869: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000002b6 34052 1727204436.96872: WORKER PROCESS EXITING 34052 1727204436.96884: Calling all_plugins_play to load vars for managed-node1 34052 1727204436.96888: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204436.96891: Calling groups_plugins_play to load vars for managed-node1 34052 1727204436.99443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204437.03593: done with get_vars() 34052 1727204437.03645: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.099) 0:00:23.354 ***** 34052 1727204437.04008: entering _queue_task() for managed-node1/include_tasks 34052 1727204437.04671: worker is 1 (out of 1 available) 34052 1727204437.04687: exiting _queue_task() for managed-node1/include_tasks 34052 1727204437.04703: done queuing things up, now waiting for results queue to drain 34052 1727204437.04705: waiting for pending results... 34052 1727204437.06144: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' 34052 1727204437.06570: in run() - task 127b8e07-fff9-66a4-e2a3-00000000005d 34052 1727204437.06576: variable 'ansible_search_path' from source: unknown 34052 1727204437.06696: calling self._execute() 34052 1727204437.06935: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.06954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.07010: variable 'omit' from source: magic vars 34052 1727204437.08688: variable 'ansible_distribution_major_version' from source: facts 34052 1727204437.08695: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204437.08782: _execute() done 34052 1727204437.08842: dumping result to json 34052 1727204437.08846: done dumping result, returning 34052 1727204437.08864: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' [127b8e07-fff9-66a4-e2a3-00000000005d] 34052 1727204437.08880: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005d 34052 1727204437.09286: no more pending results, returning what we have 34052 1727204437.09293: in VariableManager get_vars() 34052 1727204437.09352: Calling all_inventory to load vars for managed-node1 34052 1727204437.09473: Calling groups_inventory to load vars for managed-node1 34052 1727204437.09479: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204437.09497: Calling all_plugins_play to load vars for managed-node1 34052 1727204437.09501: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204437.09505: Calling groups_plugins_play to load vars for managed-node1 34052 1727204437.10198: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005d 34052 1727204437.10205: WORKER PROCESS EXITING 34052 1727204437.12882: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204437.15521: done with get_vars() 34052 1727204437.15563: variable 'ansible_search_path' from source: unknown 34052 1727204437.15586: we have included files to process 34052 1727204437.15587: generating all_blocks data 34052 1727204437.15589: done generating all_blocks data 34052 1727204437.15595: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 34052 1727204437.15596: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 34052 1727204437.15599: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 34052 1727204437.15830: in VariableManager get_vars() 34052 1727204437.15858: done with get_vars() 34052 1727204437.16419: done processing included file 34052 1727204437.16422: iterating over new_blocks loaded from include file 34052 1727204437.16423: in VariableManager get_vars() 34052 1727204437.16448: done with get_vars() 34052 1727204437.16450: filtering new block on tags 34052 1727204437.16476: done filtering new block on tags 34052 1727204437.16479: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 34052 1727204437.16485: extending task lists for all hosts with included blocks 34052 1727204437.19561: done extending task lists 34052 1727204437.19563: done processing included files 34052 1727204437.19564: results queue empty 34052 1727204437.19568: checking for any_errors_fatal 34052 1727204437.19573: done checking for any_errors_fatal 34052 1727204437.19574: checking for max_fail_percentage 34052 1727204437.19575: done checking for max_fail_percentage 34052 1727204437.19577: checking to see if all hosts have failed and the running result is not ok 34052 1727204437.19578: done checking to see if all hosts have failed 34052 1727204437.19578: getting the remaining hosts for this loop 34052 1727204437.19580: done getting the remaining hosts for this loop 34052 1727204437.19583: getting the next task for host managed-node1 34052 1727204437.19588: done getting next task for host managed-node1 34052 1727204437.19590: ^ task is: TASK: Include the task 'get_profile_stat.yml' 34052 1727204437.19593: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204437.19596: getting variables 34052 1727204437.19597: in VariableManager get_vars() 34052 1727204437.19618: Calling all_inventory to load vars for managed-node1 34052 1727204437.19621: Calling groups_inventory to load vars for managed-node1 34052 1727204437.19623: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204437.19700: Calling all_plugins_play to load vars for managed-node1 34052 1727204437.19704: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204437.19708: Calling groups_plugins_play to load vars for managed-node1 34052 1727204437.22291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204437.25554: done with get_vars() 34052 1727204437.25599: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.219) 0:00:23.574 ***** 34052 1727204437.25702: entering _queue_task() for managed-node1/include_tasks 34052 1727204437.26242: worker is 1 (out of 1 available) 34052 1727204437.26257: exiting _queue_task() for managed-node1/include_tasks 34052 1727204437.26387: done queuing things up, now waiting for results queue to drain 34052 1727204437.26390: waiting for pending results... 34052 1727204437.26789: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 34052 1727204437.26811: in run() - task 127b8e07-fff9-66a4-e2a3-0000000003b8 34052 1727204437.26835: variable 'ansible_search_path' from source: unknown 34052 1727204437.26839: variable 'ansible_search_path' from source: unknown 34052 1727204437.26921: calling self._execute() 34052 1727204437.27177: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.27181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.27186: variable 'omit' from source: magic vars 34052 1727204437.27780: variable 'ansible_distribution_major_version' from source: facts 34052 1727204437.27785: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204437.27788: _execute() done 34052 1727204437.27791: dumping result to json 34052 1727204437.27793: done dumping result, returning 34052 1727204437.27796: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-66a4-e2a3-0000000003b8] 34052 1727204437.27798: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003b8 34052 1727204437.27947: no more pending results, returning what we have 34052 1727204437.27954: in VariableManager get_vars() 34052 1727204437.28014: Calling all_inventory to load vars for managed-node1 34052 1727204437.28018: Calling groups_inventory to load vars for managed-node1 34052 1727204437.28020: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204437.28029: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003b8 34052 1727204437.28033: WORKER PROCESS EXITING 34052 1727204437.28289: Calling all_plugins_play to load vars for managed-node1 34052 1727204437.28294: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204437.28299: Calling groups_plugins_play to load vars for managed-node1 34052 1727204437.32107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 34052 1727204437.36290: done with get_vars() 34052 1727204437.36335: variable 'ansible_search_path' from source: unknown 34052 1727204437.36337: variable 'ansible_search_path' from source: unknown 34052 1727204437.36387: we have included files to process 34052 1727204437.36388: generating all_blocks data 34052 1727204437.36390: done generating all_blocks data 34052 1727204437.36392: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34052 1727204437.36393: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34052 1727204437.36395: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34052 1727204437.37605: done processing included file 34052 1727204437.37608: iterating over new_blocks loaded from include file 34052 1727204437.37614: in VariableManager get_vars() 34052 1727204437.37641: done with get_vars() 34052 1727204437.37643: filtering new block on tags 34052 1727204437.37674: done filtering new block on tags 34052 1727204437.37677: in VariableManager get_vars() 34052 1727204437.37698: done with get_vars() 34052 1727204437.37699: filtering new block on tags 34052 1727204437.37729: done filtering new block on tags 34052 1727204437.37731: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 34052 1727204437.37737: extending task lists for all hosts with included blocks 34052 1727204437.37933: done extending task lists 34052 1727204437.37935: done processing included files 34052 1727204437.37940: results queue empty 34052 1727204437.37941: checking for any_errors_fatal 34052 1727204437.37946: done checking for any_errors_fatal 34052 1727204437.37946: checking for max_fail_percentage 34052 1727204437.37948: done checking for max_fail_percentage 34052 1727204437.37948: checking to see if all hosts have failed and the running result is not ok 34052 1727204437.37949: done checking to see if all hosts have failed 34052 1727204437.37950: getting the remaining hosts for this loop 34052 1727204437.37951: done getting the remaining hosts for this loop 34052 1727204437.37954: getting the next task for host managed-node1 34052 1727204437.37958: done getting next task for host managed-node1 34052 1727204437.37961: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 34052 1727204437.37964: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204437.37968: getting variables 34052 1727204437.37969: in VariableManager get_vars() 34052 1727204437.38073: Calling all_inventory to load vars for managed-node1 34052 1727204437.38076: Calling groups_inventory to load vars for managed-node1 34052 1727204437.38079: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204437.38085: Calling all_plugins_play to load vars for managed-node1 34052 1727204437.38087: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204437.38090: Calling groups_plugins_play to load vars for managed-node1 34052 1727204437.39701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204437.42005: done with get_vars() 34052 1727204437.42048: done getting variables 34052 1727204437.42113: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.164) 0:00:23.738 ***** 34052 1727204437.42153: entering _queue_task() for managed-node1/set_fact 34052 1727204437.42785: worker is 1 (out of 1 available) 34052 1727204437.42798: exiting _queue_task() for managed-node1/set_fact 34052 1727204437.42811: done queuing things up, now waiting for results queue to drain 34052 1727204437.42813: waiting for pending results... 
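For context, the set_fact task queued here (get_profile_stat.yml:3) is not reproduced verbatim in this log, but judging from the task name and from the fact names and values echoed in its result a few entries below, it most likely resembles the following sketch:

  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false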
34052 1727204437.42988: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 34052 1727204437.43087: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b0 34052 1727204437.43156: variable 'ansible_search_path' from source: unknown 34052 1727204437.43160: variable 'ansible_search_path' from source: unknown 34052 1727204437.43179: calling self._execute() 34052 1727204437.43302: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.43315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.43334: variable 'omit' from source: magic vars 34052 1727204437.43822: variable 'ansible_distribution_major_version' from source: facts 34052 1727204437.43873: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204437.43877: variable 'omit' from source: magic vars 34052 1727204437.43935: variable 'omit' from source: magic vars 34052 1727204437.43988: variable 'omit' from source: magic vars 34052 1727204437.44137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204437.44143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204437.44146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204437.44157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204437.44179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204437.44216: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204437.44228: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.44238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.44373: Set connection var ansible_connection to ssh 34052 1727204437.44390: Set connection var ansible_timeout to 10 34052 1727204437.44464: Set connection var ansible_pipelining to False 34052 1727204437.44467: Set connection var ansible_shell_type to sh 34052 1727204437.44471: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204437.44473: Set connection var ansible_shell_executable to /bin/sh 34052 1727204437.44476: variable 'ansible_shell_executable' from source: unknown 34052 1727204437.44481: variable 'ansible_connection' from source: unknown 34052 1727204437.44490: variable 'ansible_module_compression' from source: unknown 34052 1727204437.44498: variable 'ansible_shell_type' from source: unknown 34052 1727204437.44506: variable 'ansible_shell_executable' from source: unknown 34052 1727204437.44513: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.44521: variable 'ansible_pipelining' from source: unknown 34052 1727204437.44531: variable 'ansible_timeout' from source: unknown 34052 1727204437.44540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.44740: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204437.44760: variable 
'omit' from source: magic vars 34052 1727204437.44792: starting attempt loop 34052 1727204437.44795: running the handler 34052 1727204437.44804: handler run complete 34052 1727204437.44827: attempt loop complete, returning result 34052 1727204437.44872: _execute() done 34052 1727204437.44875: dumping result to json 34052 1727204437.44878: done dumping result, returning 34052 1727204437.44881: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-66a4-e2a3-0000000004b0] 34052 1727204437.44883: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b0 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 34052 1727204437.45149: no more pending results, returning what we have 34052 1727204437.45153: results queue empty 34052 1727204437.45154: checking for any_errors_fatal 34052 1727204437.45156: done checking for any_errors_fatal 34052 1727204437.45157: checking for max_fail_percentage 34052 1727204437.45158: done checking for max_fail_percentage 34052 1727204437.45160: checking to see if all hosts have failed and the running result is not ok 34052 1727204437.45161: done checking to see if all hosts have failed 34052 1727204437.45161: getting the remaining hosts for this loop 34052 1727204437.45164: done getting the remaining hosts for this loop 34052 1727204437.45232: getting the next task for host managed-node1 34052 1727204437.45240: done getting next task for host managed-node1 34052 1727204437.45244: ^ task is: TASK: Stat profile file 34052 1727204437.45250: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204437.45256: getting variables 34052 1727204437.45258: in VariableManager get_vars() 34052 1727204437.45309: Calling all_inventory to load vars for managed-node1 34052 1727204437.45312: Calling groups_inventory to load vars for managed-node1 34052 1727204437.45315: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204437.45555: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b0 34052 1727204437.45559: WORKER PROCESS EXITING 34052 1727204437.45572: Calling all_plugins_play to load vars for managed-node1 34052 1727204437.45576: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204437.45580: Calling groups_plugins_play to load vars for managed-node1 34052 1727204437.47456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204437.49755: done with get_vars() 34052 1727204437.49799: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.077) 0:00:23.816 ***** 34052 1727204437.49914: entering _queue_task() for managed-node1/stat 34052 1727204437.50386: worker is 1 (out of 1 available) 34052 1727204437.50401: exiting _queue_task() for managed-node1/stat 34052 1727204437.50414: done queuing things up, now waiting for results queue to drain 34052 1727204437.50416: waiting for pending results... 34052 1727204437.50704: running TaskExecutor() for managed-node1/TASK: Stat profile file 34052 1727204437.50935: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b1 34052 1727204437.50939: variable 'ansible_search_path' from source: unknown 34052 1727204437.50942: variable 'ansible_search_path' from source: unknown 34052 1727204437.50945: calling self._execute() 34052 1727204437.51059: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.51081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.51097: variable 'omit' from source: magic vars 34052 1727204437.51630: variable 'ansible_distribution_major_version' from source: facts 34052 1727204437.51634: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204437.51637: variable 'omit' from source: magic vars 34052 1727204437.51659: variable 'omit' from source: magic vars 34052 1727204437.51789: variable 'profile' from source: include params 34052 1727204437.51799: variable 'interface' from source: play vars 34052 1727204437.51890: variable 'interface' from source: play vars 34052 1727204437.51953: variable 'omit' from source: magic vars 34052 1727204437.51982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204437.52031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204437.52064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204437.52095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204437.52172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204437.52178: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 34052 1727204437.52181: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.52184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.52299: Set connection var ansible_connection to ssh 34052 1727204437.52313: Set connection var ansible_timeout to 10 34052 1727204437.52370: Set connection var ansible_pipelining to False 34052 1727204437.52374: Set connection var ansible_shell_type to sh 34052 1727204437.52376: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204437.52379: Set connection var ansible_shell_executable to /bin/sh 34052 1727204437.52398: variable 'ansible_shell_executable' from source: unknown 34052 1727204437.52415: variable 'ansible_connection' from source: unknown 34052 1727204437.52422: variable 'ansible_module_compression' from source: unknown 34052 1727204437.52433: variable 'ansible_shell_type' from source: unknown 34052 1727204437.52441: variable 'ansible_shell_executable' from source: unknown 34052 1727204437.52448: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204437.52457: variable 'ansible_pipelining' from source: unknown 34052 1727204437.52464: variable 'ansible_timeout' from source: unknown 34052 1727204437.52502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204437.52743: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204437.52763: variable 'omit' from source: magic vars 34052 1727204437.52779: starting attempt loop 34052 1727204437.52830: running the handler 34052 1727204437.52833: _low_level_execute_command(): starting 34052 1727204437.52838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204437.53808: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204437.53815: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204437.54110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204437.54174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204437.55950: stdout chunk (state=3): >>>/root <<< 34052 1727204437.56172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204437.56286: stderr chunk (state=3): >>><<< 34052 
1727204437.56297: stdout chunk (state=3): >>><<< 34052 1727204437.56493: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204437.56498: _low_level_execute_command(): starting 34052 1727204437.56502: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426 `" && echo ansible-tmp-1727204437.5638824-35735-157336348616426="` echo /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426 `" ) && sleep 0' 34052 1727204437.57765: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204437.57797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204437.57988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204437.58022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204437.58138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204437.58155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204437.58250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204437.60342: stdout chunk (state=3): >>>ansible-tmp-1727204437.5638824-35735-157336348616426=/root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426 <<< 34052 1727204437.60879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204437.60884: 
stdout chunk (state=3): >>><<< 34052 1727204437.60887: stderr chunk (state=3): >>><<< 34052 1727204437.60890: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204437.5638824-35735-157336348616426=/root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204437.60892: variable 'ansible_module_compression' from source: unknown 34052 1727204437.60895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 34052 1727204437.60915: variable 'ansible_facts' from source: unknown 34052 1727204437.61135: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/AnsiballZ_stat.py 34052 1727204437.61804: Sending initial data 34052 1727204437.61818: Sent initial data (153 bytes) 34052 1727204437.63070: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204437.63388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204437.63603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204437.63701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204437.65416: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34052 1727204437.65470: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 
<<< 34052 1727204437.65474: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 34052 1727204437.65477: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 34052 1727204437.65479: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 34052 1727204437.65481: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 34052 1727204437.65482: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 34052 1727204437.65484: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 34052 1727204437.65486: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 34052 1727204437.65489: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204437.65774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204437.65809: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpc6dgdpur /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/AnsiballZ_stat.py <<< 34052 1727204437.65813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/AnsiballZ_stat.py" <<< 34052 1727204437.65867: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpc6dgdpur" to remote "/root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/AnsiballZ_stat.py" <<< 34052 1727204437.68304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204437.68308: stdout chunk (state=3): >>><<< 34052 1727204437.68310: stderr chunk (state=3): >>><<< 34052 1727204437.68364: done transferring module to remote 34052 1727204437.68370: _low_level_execute_command(): starting 34052 1727204437.68373: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/ /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/AnsiballZ_stat.py && sleep 0' 34052 1727204437.69486: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204437.69700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204437.69711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204437.69714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204437.70169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204437.70390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204437.70474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204437.72777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204437.72781: stdout chunk (state=3): >>><<< 34052 1727204437.72786: stderr chunk (state=3): >>><<< 34052 1727204437.72788: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204437.72791: _low_level_execute_command(): starting 34052 1727204437.72793: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/AnsiballZ_stat.py && sleep 0' 34052 1727204437.74224: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204437.74277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204437.74387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 34052 1727204437.91371: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 34052 1727204437.92728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204437.92786: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. <<< 34052 1727204437.92991: stderr chunk (state=3): >>><<< 34052 1727204437.92994: stdout chunk (state=3): >>><<< 34052 1727204437.93129: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
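The stat invocation just completed belongs to the "Stat profile file" task at get_profile_stat.yml:9. A minimal sketch, inferred from the module_args echoed in the result (checksum, attribute and mime collection disabled; the resolved path /etc/sysconfig/network-scripts/ifcfg-veth0 suggests the path is templated from the profile variable) and assuming the result is registered as profile_stat, since that is the variable read by the condition of the following task:

  - name: Stat profile file
    stat:
      path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      get_checksum: false
      get_attributes: false
      get_mime: false
    register: profile_stat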
34052 1727204437.93134: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204437.93137: _low_level_execute_command(): starting 34052 1727204437.93139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204437.5638824-35735-157336348616426/ > /dev/null 2>&1 && sleep 0' 34052 1727204437.94491: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204437.94614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204437.94632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204437.94764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204437.94770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204437.96920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204437.96927: stdout chunk (state=3): >>><<< 34052 1727204437.96930: stderr chunk (state=3): >>><<< 34052 1727204437.97016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204437.97019: handler run complete 34052 1727204437.97076: attempt loop complete, returning result 34052 1727204437.97080: _execute() done 34052 1727204437.97082: dumping result to json 34052 1727204437.97092: done dumping result, returning 34052 1727204437.97107: done running TaskExecutor() for managed-node1/TASK: Stat profile file [127b8e07-fff9-66a4-e2a3-0000000004b1] 34052 1727204437.97110: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b1 34052 1727204437.97490: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b1 34052 1727204437.97493: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 34052 1727204437.97563: no more pending results, returning what we have 34052 1727204437.97568: results queue empty 34052 1727204437.97569: checking for any_errors_fatal 34052 1727204437.97577: done checking for any_errors_fatal 34052 1727204437.97578: checking for max_fail_percentage 34052 1727204437.97579: done checking for max_fail_percentage 34052 1727204437.97580: checking to see if all hosts have failed and the running result is not ok 34052 1727204437.97581: done checking to see if all hosts have failed 34052 1727204437.97581: getting the remaining hosts for this loop 34052 1727204437.97583: done getting the remaining hosts for this loop 34052 1727204437.97588: getting the next task for host managed-node1 34052 1727204437.97594: done getting next task for host managed-node1 34052 1727204437.97597: ^ task is: TASK: Set NM profile exist flag based on the profile files 34052 1727204437.97601: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204437.97606: getting variables 34052 1727204437.97609: in VariableManager get_vars() 34052 1727204437.97659: Calling all_inventory to load vars for managed-node1 34052 1727204437.97662: Calling groups_inventory to load vars for managed-node1 34052 1727204437.97664: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204437.97680: Calling all_plugins_play to load vars for managed-node1 34052 1727204437.97683: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204437.97687: Calling groups_plugins_play to load vars for managed-node1 34052 1727204438.01551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204438.06234: done with get_vars() 34052 1727204438.06279: done getting variables 34052 1727204438.06350: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:00:38 -0400 (0:00:00.565) 0:00:24.382 ***** 34052 1727204438.06494: entering _queue_task() for managed-node1/set_fact 34052 1727204438.07324: worker is 1 (out of 1 available) 34052 1727204438.07339: exiting _queue_task() for managed-node1/set_fact 34052 1727204438.07352: done queuing things up, now waiting for results queue to drain 34052 1727204438.07353: waiting for pending results... 
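The task queued here (get_profile_stat.yml:17) is skipped in the next entry with false_condition profile_stat.stat.exists, so it is presumably a conditional set_fact that flips the flag initialized earlier. A sketch under that assumption, not the verbatim source:

  - name: Set NM profile exist flag based on the profile files
    set_fact:
      lsr_net_profile_exists: true
    when: profile_stat.stat.exists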
34052 1727204438.07818: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 34052 1727204438.08054: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b2 34052 1727204438.08070: variable 'ansible_search_path' from source: unknown 34052 1727204438.08075: variable 'ansible_search_path' from source: unknown 34052 1727204438.08376: calling self._execute() 34052 1727204438.08488: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.08491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.08503: variable 'omit' from source: magic vars 34052 1727204438.09351: variable 'ansible_distribution_major_version' from source: facts 34052 1727204438.09418: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204438.09622: variable 'profile_stat' from source: set_fact 34052 1727204438.09640: Evaluated conditional (profile_stat.stat.exists): False 34052 1727204438.09644: when evaluation is False, skipping this task 34052 1727204438.09646: _execute() done 34052 1727204438.09650: dumping result to json 34052 1727204438.09652: done dumping result, returning 34052 1727204438.09662: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-66a4-e2a3-0000000004b2] 34052 1727204438.09851: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b2 34052 1727204438.09931: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b2 34052 1727204438.09935: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34052 1727204438.10017: no more pending results, returning what we have 34052 1727204438.10021: results queue empty 34052 1727204438.10022: checking for any_errors_fatal 34052 1727204438.10034: done checking for any_errors_fatal 34052 1727204438.10035: checking for max_fail_percentage 34052 1727204438.10037: done checking for max_fail_percentage 34052 1727204438.10038: checking to see if all hosts have failed and the running result is not ok 34052 1727204438.10039: done checking to see if all hosts have failed 34052 1727204438.10039: getting the remaining hosts for this loop 34052 1727204438.10041: done getting the remaining hosts for this loop 34052 1727204438.10046: getting the next task for host managed-node1 34052 1727204438.10052: done getting next task for host managed-node1 34052 1727204438.10055: ^ task is: TASK: Get NM profile info 34052 1727204438.10060: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204438.10067: getting variables 34052 1727204438.10069: in VariableManager get_vars() 34052 1727204438.10116: Calling all_inventory to load vars for managed-node1 34052 1727204438.10119: Calling groups_inventory to load vars for managed-node1 34052 1727204438.10121: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204438.10140: Calling all_plugins_play to load vars for managed-node1 34052 1727204438.10144: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204438.10148: Calling groups_plugins_play to load vars for managed-node1 34052 1727204438.13798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204438.18494: done with get_vars() 34052 1727204438.18542: done getting variables 34052 1727204438.18614: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:00:38 -0400 (0:00:00.121) 0:00:24.503 ***** 34052 1727204438.18655: entering _queue_task() for managed-node1/shell 34052 1727204438.19460: worker is 1 (out of 1 available) 34052 1727204438.19677: exiting _queue_task() for managed-node1/shell 34052 1727204438.19689: done queuing things up, now waiting for results queue to drain 34052 1727204438.19691: waiting for pending results... 34052 1727204438.20087: running TaskExecutor() for managed-node1/TASK: Get NM profile info 34052 1727204438.20258: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b3 34052 1727204438.20276: variable 'ansible_search_path' from source: unknown 34052 1727204438.20279: variable 'ansible_search_path' from source: unknown 34052 1727204438.20317: calling self._execute() 34052 1727204438.20670: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.20681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.20684: variable 'omit' from source: magic vars 34052 1727204438.21598: variable 'ansible_distribution_major_version' from source: facts 34052 1727204438.21612: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204438.21662: variable 'omit' from source: magic vars 34052 1727204438.21792: variable 'omit' from source: magic vars 34052 1727204438.22019: variable 'profile' from source: include params 34052 1727204438.22023: variable 'interface' from source: play vars 34052 1727204438.22251: variable 'interface' from source: play vars 34052 1727204438.22318: variable 'omit' from source: magic vars 34052 1727204438.22379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204438.22429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204438.22564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204438.22586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204438.22600: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204438.22633: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204438.22636: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.22772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.22864: Set connection var ansible_connection to ssh 34052 1727204438.22992: Set connection var ansible_timeout to 10 34052 1727204438.23000: Set connection var ansible_pipelining to False 34052 1727204438.23003: Set connection var ansible_shell_type to sh 34052 1727204438.23013: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204438.23021: Set connection var ansible_shell_executable to /bin/sh 34052 1727204438.23053: variable 'ansible_shell_executable' from source: unknown 34052 1727204438.23056: variable 'ansible_connection' from source: unknown 34052 1727204438.23059: variable 'ansible_module_compression' from source: unknown 34052 1727204438.23062: variable 'ansible_shell_type' from source: unknown 34052 1727204438.23064: variable 'ansible_shell_executable' from source: unknown 34052 1727204438.23068: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.23107: variable 'ansible_pipelining' from source: unknown 34052 1727204438.23111: variable 'ansible_timeout' from source: unknown 34052 1727204438.23113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.23476: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204438.23490: variable 'omit' from source: magic vars 34052 1727204438.23496: starting attempt loop 34052 1727204438.23500: running the handler 34052 1727204438.23512: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204438.23650: _low_level_execute_command(): starting 34052 1727204438.23654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204438.25332: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204438.25338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
34052 1727204438.25486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204438.25540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204438.27495: stdout chunk (state=3): >>>/root <<< 34052 1727204438.27618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204438.27739: stderr chunk (state=3): >>><<< 34052 1727204438.27743: stdout chunk (state=3): >>><<< 34052 1727204438.27861: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204438.27869: _low_level_execute_command(): starting 34052 1727204438.27952: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952 `" && echo ansible-tmp-1727204438.277981-35755-240492053271952="` echo /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952 `" ) && sleep 0' 34052 1727204438.29549: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204438.29674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204438.29778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204438.29800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 34052 1727204438.29950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204438.32028: stdout chunk (state=3): >>>ansible-tmp-1727204438.277981-35755-240492053271952=/root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952 <<< 34052 1727204438.32256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204438.32277: stderr chunk (state=3): >>><<< 34052 1727204438.32287: stdout chunk (state=3): >>><<< 34052 1727204438.32315: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204438.277981-35755-240492053271952=/root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204438.32365: variable 'ansible_module_compression' from source: unknown 34052 1727204438.32473: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204438.32476: variable 'ansible_facts' from source: unknown 34052 1727204438.32561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/AnsiballZ_command.py 34052 1727204438.32728: Sending initial data 34052 1727204438.32732: Sent initial data (155 bytes) 34052 1727204438.33567: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204438.33589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204438.33619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/45f5f78759' <<< 34052 1727204438.33638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204438.33660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204438.33764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204438.35583: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204438.35889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204438.35933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmplsrl9fr2 /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/AnsiballZ_command.py <<< 34052 1727204438.35937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/AnsiballZ_command.py" <<< 34052 1727204438.35989: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmplsrl9fr2" to remote "/root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/AnsiballZ_command.py" <<< 34052 1727204438.37327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204438.37529: stderr chunk (state=3): >>><<< 34052 1727204438.37533: stdout chunk (state=3): >>><<< 34052 1727204438.37536: done transferring module to remote 34052 1727204438.37539: _low_level_execute_command(): starting 34052 1727204438.37541: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/ /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/AnsiballZ_command.py && sleep 0' 34052 1727204438.38832: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204438.38980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204438.39057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204438.39108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204438.39171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204438.39298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204438.41283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204438.41507: stderr chunk (state=3): >>><<< 34052 1727204438.41520: stdout chunk (state=3): >>><<< 34052 1727204438.41773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204438.41781: _low_level_execute_command(): starting 34052 1727204438.41784: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/AnsiballZ_command.py && sleep 0' 34052 1727204438.42895: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204438.42956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204438.43124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204438.43282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 34052 1727204438.43319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204438.62438: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 15:00:38.603847", "end": "2024-09-24 15:00:38.623295", "delta": "0:00:00.019448", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204438.64275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204438.64387: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. <<< 34052 1727204438.64398: stdout chunk (state=3): >>><<< 34052 1727204438.64424: stderr chunk (state=3): >>><<< 34052 1727204438.64739: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 15:00:38.603847", "end": "2024-09-24 15:00:38.623295", "delta": "0:00:00.019448", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204438.64744: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204438.64747: _low_level_execute_command(): starting 34052 1727204438.64750: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204438.277981-35755-240492053271952/ > /dev/null 2>&1 && sleep 0' 34052 1727204438.65906: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204438.65922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204438.66180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204438.66280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204438.66331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204438.66392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204438.68388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204438.68496: stderr chunk (state=3): >>><<< 34052 1727204438.68507: stdout chunk (state=3): >>><<< 34052 1727204438.68879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204438.68883: handler run complete 34052 1727204438.68885: Evaluated conditional (False): False 34052 1727204438.68888: attempt loop complete, returning result 34052 1727204438.68890: _execute() done 34052 1727204438.68892: dumping result to json 34052 1727204438.68894: done dumping result, returning 34052 1727204438.68896: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [127b8e07-fff9-66a4-e2a3-0000000004b3] 34052 1727204438.68898: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b3 34052 1727204438.68990: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b3 34052 1727204438.68994: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.019448", "end": "2024-09-24 15:00:38.623295", "rc": 0, "start": "2024-09-24 15:00:38.603847" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 34052 1727204438.69085: no more pending results, returning what we have 34052 1727204438.69094: results queue empty 34052 1727204438.69095: checking for any_errors_fatal 34052 1727204438.69102: done checking for any_errors_fatal 34052 1727204438.69103: checking for max_fail_percentage 34052 1727204438.69104: done checking for max_fail_percentage 34052 1727204438.69105: checking to see if all hosts have failed and the running result is not ok 34052 1727204438.69106: done checking to see if all hosts have failed 34052 1727204438.69107: getting the remaining hosts for this loop 34052 1727204438.69109: done getting the remaining hosts for this loop 34052 1727204438.69114: getting the next task for host managed-node1 34052 1727204438.69122: done getting next task for host managed-node1 34052 1727204438.69125: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 34052 1727204438.69130: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204438.69135: getting variables 34052 1727204438.69136: in VariableManager get_vars() 34052 1727204438.69441: Calling all_inventory to load vars for managed-node1 34052 1727204438.69444: Calling groups_inventory to load vars for managed-node1 34052 1727204438.69447: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204438.69461: Calling all_plugins_play to load vars for managed-node1 34052 1727204438.69467: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204438.69471: Calling groups_plugins_play to load vars for managed-node1 34052 1727204438.73661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204438.78202: done with get_vars() 34052 1727204438.78241: done getting variables 34052 1727204438.78426: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:00:38 -0400 (0:00:00.598) 0:00:25.101 ***** 34052 1727204438.78514: entering _queue_task() for managed-node1/set_fact 34052 1727204438.79430: worker is 1 (out of 1 available) 34052 1727204438.79446: exiting _queue_task() for managed-node1/set_fact 34052 1727204438.79460: done queuing things up, now waiting for results queue to drain 34052 1727204438.79462: waiting for pending results... 
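The trace above is the "Get NM profile info" step from get_profile_stat.yml: Ansible creates a remote temp directory, transfers AnsiballZ_command.py over SFTP, runs it with /usr/bin/python3.12, collects the JSON result, and then removes the temp directory. Judging by the command string, the register name (nm_profile_exists, checked in the next task as nm_profile_exists.rc == 0) and the fact that the reported result shows "changed": false even though the raw module output says "changed": true, the task is probably shaped roughly like the sketch below; the exact options in get_profile_stat.yml may differ:

  - name: Get NM profile info
    shell: "nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc"   # the trace shows this run with profile=veth0
    register: nm_profile_exists
    changed_when: false    # inferred from the "changed": false in the reported result
    ignore_errors: true    # assumption: grep exits non-zero when no matching profile exists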
34052 1727204438.80052: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 34052 1727204438.80366: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b4 34052 1727204438.80478: variable 'ansible_search_path' from source: unknown 34052 1727204438.80483: variable 'ansible_search_path' from source: unknown 34052 1727204438.80521: calling self._execute() 34052 1727204438.80747: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.80755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.80768: variable 'omit' from source: magic vars 34052 1727204438.81765: variable 'ansible_distribution_major_version' from source: facts 34052 1727204438.81890: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204438.82060: variable 'nm_profile_exists' from source: set_fact 34052 1727204438.82380: Evaluated conditional (nm_profile_exists.rc == 0): True 34052 1727204438.82389: variable 'omit' from source: magic vars 34052 1727204438.82657: variable 'omit' from source: magic vars 34052 1727204438.82751: variable 'omit' from source: magic vars 34052 1727204438.82954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204438.83073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204438.83278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204438.83419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204438.83434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204438.83470: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204438.83473: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.83479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.83892: Set connection var ansible_connection to ssh 34052 1727204438.83902: Set connection var ansible_timeout to 10 34052 1727204438.83910: Set connection var ansible_pipelining to False 34052 1727204438.83913: Set connection var ansible_shell_type to sh 34052 1727204438.83923: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204438.84273: Set connection var ansible_shell_executable to /bin/sh 34052 1727204438.84307: variable 'ansible_shell_executable' from source: unknown 34052 1727204438.84310: variable 'ansible_connection' from source: unknown 34052 1727204438.84313: variable 'ansible_module_compression' from source: unknown 34052 1727204438.84316: variable 'ansible_shell_type' from source: unknown 34052 1727204438.84318: variable 'ansible_shell_executable' from source: unknown 34052 1727204438.84320: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.84325: variable 'ansible_pipelining' from source: unknown 34052 1727204438.84332: variable 'ansible_timeout' from source: unknown 34052 1727204438.84336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.85040: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204438.85049: variable 'omit' from source: magic vars 34052 1727204438.85052: starting attempt loop 34052 1727204438.85054: running the handler 34052 1727204438.85143: handler run complete 34052 1727204438.85302: attempt loop complete, returning result 34052 1727204438.85306: _execute() done 34052 1727204438.85309: dumping result to json 34052 1727204438.85314: done dumping result, returning 34052 1727204438.85325: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-66a4-e2a3-0000000004b4] 34052 1727204438.85333: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b4 34052 1727204438.85672: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b4 34052 1727204438.85676: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 34052 1727204438.85929: no more pending results, returning what we have 34052 1727204438.85931: results queue empty 34052 1727204438.85932: checking for any_errors_fatal 34052 1727204438.85940: done checking for any_errors_fatal 34052 1727204438.85941: checking for max_fail_percentage 34052 1727204438.85943: done checking for max_fail_percentage 34052 1727204438.85944: checking to see if all hosts have failed and the running result is not ok 34052 1727204438.85945: done checking to see if all hosts have failed 34052 1727204438.85946: getting the remaining hosts for this loop 34052 1727204438.85948: done getting the remaining hosts for this loop 34052 1727204438.85953: getting the next task for host managed-node1 34052 1727204438.85962: done getting next task for host managed-node1 34052 1727204438.85966: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 34052 1727204438.85971: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204438.85975: getting variables 34052 1727204438.85976: in VariableManager get_vars() 34052 1727204438.86016: Calling all_inventory to load vars for managed-node1 34052 1727204438.86019: Calling groups_inventory to load vars for managed-node1 34052 1727204438.86021: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204438.86032: Calling all_plugins_play to load vars for managed-node1 34052 1727204438.86035: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204438.86038: Calling groups_plugins_play to load vars for managed-node1 34052 1727204438.91046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204438.95671: done with get_vars() 34052 1727204438.95708: done getting variables 34052 1727204438.95782: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204438.95916: variable 'profile' from source: include params 34052 1727204438.95921: variable 'interface' from source: play vars 34052 1727204438.95994: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:00:38 -0400 (0:00:00.175) 0:00:25.277 ***** 34052 1727204438.96036: entering _queue_task() for managed-node1/command 34052 1727204438.96541: worker is 1 (out of 1 available) 34052 1727204438.96553: exiting _queue_task() for managed-node1/command 34052 1727204438.96570: done queuing things up, now waiting for results queue to drain 34052 1727204438.96572: waiting for pending results... 
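The "Set NM profile exist flag and ansible_managed flag true based on the nmcli output" step above is a pure set_fact task: it runs only because the conditional nm_profile_exists.rc == 0 evaluated True, and it reports exactly the three lsr_net_profile_* facts shown in its ok: result. A minimal reconstruction, assuming no conditions beyond the ones visible in the trace:

  - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
    set_fact:
      lsr_net_profile_exists: true            # all three values taken from the ok: result above
      lsr_net_profile_ansible_managed: true
      lsr_net_profile_fingerprint: true
    when: nm_profile_exists.rc == 0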
34052 1727204438.96813: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-veth0 34052 1727204438.97675: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b6 34052 1727204438.97680: variable 'ansible_search_path' from source: unknown 34052 1727204438.97684: variable 'ansible_search_path' from source: unknown 34052 1727204438.97687: calling self._execute() 34052 1727204438.97690: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204438.97692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204438.97695: variable 'omit' from source: magic vars 34052 1727204438.98863: variable 'ansible_distribution_major_version' from source: facts 34052 1727204438.98870: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204438.99182: variable 'profile_stat' from source: set_fact 34052 1727204438.99301: Evaluated conditional (profile_stat.stat.exists): False 34052 1727204438.99305: when evaluation is False, skipping this task 34052 1727204438.99308: _execute() done 34052 1727204438.99310: dumping result to json 34052 1727204438.99313: done dumping result, returning 34052 1727204438.99320: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-veth0 [127b8e07-fff9-66a4-e2a3-0000000004b6] 34052 1727204438.99325: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b6 34052 1727204438.99812: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b6 34052 1727204438.99816: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34052 1727204438.99883: no more pending results, returning what we have 34052 1727204438.99887: results queue empty 34052 1727204438.99888: checking for any_errors_fatal 34052 1727204438.99897: done checking for any_errors_fatal 34052 1727204438.99898: checking for max_fail_percentage 34052 1727204438.99900: done checking for max_fail_percentage 34052 1727204438.99901: checking to see if all hosts have failed and the running result is not ok 34052 1727204438.99901: done checking to see if all hosts have failed 34052 1727204438.99902: getting the remaining hosts for this loop 34052 1727204438.99904: done getting the remaining hosts for this loop 34052 1727204438.99909: getting the next task for host managed-node1 34052 1727204438.99917: done getting next task for host managed-node1 34052 1727204438.99920: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 34052 1727204438.99931: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204438.99938: getting variables 34052 1727204438.99939: in VariableManager get_vars() 34052 1727204439.00000: Calling all_inventory to load vars for managed-node1 34052 1727204439.00003: Calling groups_inventory to load vars for managed-node1 34052 1727204439.00006: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204439.00023: Calling all_plugins_play to load vars for managed-node1 34052 1727204439.00026: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204439.00030: Calling groups_plugins_play to load vars for managed-node1 34052 1727204439.02033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204439.06001: done with get_vars() 34052 1727204439.06046: done getting variables 34052 1727204439.06119: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204439.06254: variable 'profile' from source: include params 34052 1727204439.06259: variable 'interface' from source: play vars 34052 1727204439.06329: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:00:39 -0400 (0:00:00.103) 0:00:25.380 ***** 34052 1727204439.06371: entering _queue_task() for managed-node1/set_fact 34052 1727204439.06900: worker is 1 (out of 1 available) 34052 1727204439.06914: exiting _queue_task() for managed-node1/set_fact 34052 1727204439.06926: done queuing things up, now waiting for results queue to drain 34052 1727204439.06928: waiting for pending results... 
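From this point on, the ifcfg-related checks are skipped with false_condition: profile_stat.stat.exists. The profile_stat variable is not produced anywhere in this excerpt; its name and the .stat.exists attribute suggest an earlier stat task registered against the ifcfg file, and the skipped step itself is a command task guarded by that result. The two tasks below are only illustrative sketches under that assumption: the stat path, grep pattern, and register names are hypothetical, while the task names, modules, and when-conditions come from the trace:

  - name: Get file stat for ifcfg-{{ profile }}      # hypothetical producer of profile_stat
    stat:
      path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical path
    register: profile_stat

  - name: Get the ansible_managed comment in ifcfg-{{ profile }}
    command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical pattern
    register: ifcfg_ansible_managed                  # hypothetical register name
    when: profile_stat.stat.exists

On managed-node1 the veth0 profile lives under /etc/NetworkManager/system-connections (see the nmcli output above), so no ifcfg file exists and every task carrying this guard is skipped.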
34052 1727204439.07141: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-veth0 34052 1727204439.07301: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b7 34052 1727204439.07414: variable 'ansible_search_path' from source: unknown 34052 1727204439.07424: variable 'ansible_search_path' from source: unknown 34052 1727204439.07588: calling self._execute() 34052 1727204439.07890: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.07894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.07898: variable 'omit' from source: magic vars 34052 1727204439.08376: variable 'ansible_distribution_major_version' from source: facts 34052 1727204439.08397: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204439.08549: variable 'profile_stat' from source: set_fact 34052 1727204439.08579: Evaluated conditional (profile_stat.stat.exists): False 34052 1727204439.08587: when evaluation is False, skipping this task 34052 1727204439.08594: _execute() done 34052 1727204439.08601: dumping result to json 34052 1727204439.08608: done dumping result, returning 34052 1727204439.08619: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-veth0 [127b8e07-fff9-66a4-e2a3-0000000004b7] 34052 1727204439.08628: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b7 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34052 1727204439.08922: no more pending results, returning what we have 34052 1727204439.08927: results queue empty 34052 1727204439.08928: checking for any_errors_fatal 34052 1727204439.08935: done checking for any_errors_fatal 34052 1727204439.08936: checking for max_fail_percentage 34052 1727204439.08938: done checking for max_fail_percentage 34052 1727204439.08939: checking to see if all hosts have failed and the running result is not ok 34052 1727204439.08940: done checking to see if all hosts have failed 34052 1727204439.08941: getting the remaining hosts for this loop 34052 1727204439.08943: done getting the remaining hosts for this loop 34052 1727204439.08948: getting the next task for host managed-node1 34052 1727204439.08955: done getting next task for host managed-node1 34052 1727204439.08958: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 34052 1727204439.08964: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204439.08971: getting variables 34052 1727204439.08973: in VariableManager get_vars() 34052 1727204439.09022: Calling all_inventory to load vars for managed-node1 34052 1727204439.09025: Calling groups_inventory to load vars for managed-node1 34052 1727204439.09028: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204439.09044: Calling all_plugins_play to load vars for managed-node1 34052 1727204439.09047: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204439.09050: Calling groups_plugins_play to load vars for managed-node1 34052 1727204439.09621: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b7 34052 1727204439.09625: WORKER PROCESS EXITING 34052 1727204439.20721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204439.22078: done with get_vars() 34052 1727204439.22114: done getting variables 34052 1727204439.22174: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204439.22281: variable 'profile' from source: include params 34052 1727204439.22285: variable 'interface' from source: play vars 34052 1727204439.22349: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:00:39 -0400 (0:00:00.160) 0:00:25.541 ***** 34052 1727204439.22385: entering _queue_task() for managed-node1/command 34052 1727204439.23045: worker is 1 (out of 1 available) 34052 1727204439.23083: exiting _queue_task() for managed-node1/command 34052 1727204439.23206: done queuing things up, now waiting for results queue to drain 34052 1727204439.23210: waiting for pending results... 
34052 1727204439.23451: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-veth0 34052 1727204439.23622: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b8 34052 1727204439.23655: variable 'ansible_search_path' from source: unknown 34052 1727204439.23664: variable 'ansible_search_path' from source: unknown 34052 1727204439.23717: calling self._execute() 34052 1727204439.23928: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.23976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.24033: variable 'omit' from source: magic vars 34052 1727204439.25187: variable 'ansible_distribution_major_version' from source: facts 34052 1727204439.25386: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204439.25638: variable 'profile_stat' from source: set_fact 34052 1727204439.25741: Evaluated conditional (profile_stat.stat.exists): False 34052 1727204439.25782: when evaluation is False, skipping this task 34052 1727204439.25886: _execute() done 34052 1727204439.25890: dumping result to json 34052 1727204439.25893: done dumping result, returning 34052 1727204439.25895: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-veth0 [127b8e07-fff9-66a4-e2a3-0000000004b8] 34052 1727204439.25898: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b8 34052 1727204439.26305: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b8 34052 1727204439.26309: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34052 1727204439.26377: no more pending results, returning what we have 34052 1727204439.26381: results queue empty 34052 1727204439.26382: checking for any_errors_fatal 34052 1727204439.26388: done checking for any_errors_fatal 34052 1727204439.26389: checking for max_fail_percentage 34052 1727204439.26390: done checking for max_fail_percentage 34052 1727204439.26393: checking to see if all hosts have failed and the running result is not ok 34052 1727204439.26394: done checking to see if all hosts have failed 34052 1727204439.26395: getting the remaining hosts for this loop 34052 1727204439.26396: done getting the remaining hosts for this loop 34052 1727204439.26401: getting the next task for host managed-node1 34052 1727204439.26408: done getting next task for host managed-node1 34052 1727204439.26411: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 34052 1727204439.26418: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204439.26423: getting variables 34052 1727204439.26424: in VariableManager get_vars() 34052 1727204439.26472: Calling all_inventory to load vars for managed-node1 34052 1727204439.26475: Calling groups_inventory to load vars for managed-node1 34052 1727204439.26478: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204439.26492: Calling all_plugins_play to load vars for managed-node1 34052 1727204439.26495: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204439.26499: Calling groups_plugins_play to load vars for managed-node1 34052 1727204439.28916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204439.32933: done with get_vars() 34052 1727204439.33010: done getting variables 34052 1727204439.33206: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204439.33335: variable 'profile' from source: include params 34052 1727204439.33383: variable 'interface' from source: play vars 34052 1727204439.33531: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:00:39 -0400 (0:00:00.112) 0:00:25.653 ***** 34052 1727204439.33599: entering _queue_task() for managed-node1/set_fact 34052 1727204439.34654: worker is 1 (out of 1 available) 34052 1727204439.34672: exiting _queue_task() for managed-node1/set_fact 34052 1727204439.34689: done queuing things up, now waiting for results queue to drain 34052 1727204439.34691: waiting for pending results... 
34052 1727204439.35372: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-veth0 34052 1727204439.35897: in run() - task 127b8e07-fff9-66a4-e2a3-0000000004b9 34052 1727204439.35902: variable 'ansible_search_path' from source: unknown 34052 1727204439.35906: variable 'ansible_search_path' from source: unknown 34052 1727204439.36002: calling self._execute() 34052 1727204439.36185: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.36280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.36291: variable 'omit' from source: magic vars 34052 1727204439.37392: variable 'ansible_distribution_major_version' from source: facts 34052 1727204439.37473: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204439.38063: variable 'profile_stat' from source: set_fact 34052 1727204439.38088: Evaluated conditional (profile_stat.stat.exists): False 34052 1727204439.38097: when evaluation is False, skipping this task 34052 1727204439.38101: _execute() done 34052 1727204439.38106: dumping result to json 34052 1727204439.38143: done dumping result, returning 34052 1727204439.38151: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-veth0 [127b8e07-fff9-66a4-e2a3-0000000004b9] 34052 1727204439.38156: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b9 34052 1727204439.38291: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000004b9 34052 1727204439.38295: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34052 1727204439.38353: no more pending results, returning what we have 34052 1727204439.38357: results queue empty 34052 1727204439.38358: checking for any_errors_fatal 34052 1727204439.38363: done checking for any_errors_fatal 34052 1727204439.38363: checking for max_fail_percentage 34052 1727204439.38367: done checking for max_fail_percentage 34052 1727204439.38368: checking to see if all hosts have failed and the running result is not ok 34052 1727204439.38369: done checking to see if all hosts have failed 34052 1727204439.38369: getting the remaining hosts for this loop 34052 1727204439.38372: done getting the remaining hosts for this loop 34052 1727204439.38376: getting the next task for host managed-node1 34052 1727204439.38384: done getting next task for host managed-node1 34052 1727204439.38389: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 34052 1727204439.38392: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204439.38398: getting variables 34052 1727204439.38400: in VariableManager get_vars() 34052 1727204439.38451: Calling all_inventory to load vars for managed-node1 34052 1727204439.38455: Calling groups_inventory to load vars for managed-node1 34052 1727204439.38458: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204439.38722: Calling all_plugins_play to load vars for managed-node1 34052 1727204439.38726: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204439.38731: Calling groups_plugins_play to load vars for managed-node1 34052 1727204439.43159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204439.49279: done with get_vars() 34052 1727204439.49327: done getting variables 34052 1727204439.49401: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204439.49534: variable 'profile' from source: include params 34052 1727204439.49542: variable 'interface' from source: play vars 34052 1727204439.49598: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:00:39 -0400 (0:00:00.160) 0:00:25.813 ***** 34052 1727204439.49624: entering _queue_task() for managed-node1/assert 34052 1727204439.49928: worker is 1 (out of 1 available) 34052 1727204439.49945: exiting _queue_task() for managed-node1/assert 34052 1727204439.49958: done queuing things up, now waiting for results queue to drain 34052 1727204439.49960: waiting for pending results... 
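The two "Verify ..." steps above are set_fact tasks with the same profile_stat.stat.exists guard, so on this host they are skipped as well. Judging by the task names and by the facts already set earlier, they presumably recompute lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint from registered grep results whenever an ifcfg file does exist; the expression below is only a guess at that shape (it reuses the hypothetical ifcfg_ansible_managed register from the previous sketch), not the contents of get_profile_stat.yml:

  - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
    set_fact:
      lsr_net_profile_ansible_managed: "{{ ifcfg_ansible_managed.rc == 0 }}"   # hypothetical expression
    when: profile_stat.stat.exists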
34052 1727204439.50162: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'veth0' 34052 1727204439.50250: in run() - task 127b8e07-fff9-66a4-e2a3-0000000003b9 34052 1727204439.50263: variable 'ansible_search_path' from source: unknown 34052 1727204439.50268: variable 'ansible_search_path' from source: unknown 34052 1727204439.50305: calling self._execute() 34052 1727204439.50387: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.50391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.50401: variable 'omit' from source: magic vars 34052 1727204439.50716: variable 'ansible_distribution_major_version' from source: facts 34052 1727204439.50728: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204439.50732: variable 'omit' from source: magic vars 34052 1727204439.50770: variable 'omit' from source: magic vars 34052 1727204439.50852: variable 'profile' from source: include params 34052 1727204439.50857: variable 'interface' from source: play vars 34052 1727204439.50906: variable 'interface' from source: play vars 34052 1727204439.50922: variable 'omit' from source: magic vars 34052 1727204439.50958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204439.50996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204439.51044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204439.51052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.51066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.51145: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204439.51149: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.51151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.51282: Set connection var ansible_connection to ssh 34052 1727204439.51291: Set connection var ansible_timeout to 10 34052 1727204439.51298: Set connection var ansible_pipelining to False 34052 1727204439.51301: Set connection var ansible_shell_type to sh 34052 1727204439.51402: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204439.51406: Set connection var ansible_shell_executable to /bin/sh 34052 1727204439.51409: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.51411: variable 'ansible_connection' from source: unknown 34052 1727204439.51413: variable 'ansible_module_compression' from source: unknown 34052 1727204439.51416: variable 'ansible_shell_type' from source: unknown 34052 1727204439.51418: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.51420: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.51422: variable 'ansible_pipelining' from source: unknown 34052 1727204439.51427: variable 'ansible_timeout' from source: unknown 34052 1727204439.51430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.51575: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204439.51579: variable 'omit' from source: magic vars 34052 1727204439.51581: starting attempt loop 34052 1727204439.51584: running the handler 34052 1727204439.51765: variable 'lsr_net_profile_exists' from source: set_fact 34052 1727204439.51770: Evaluated conditional (lsr_net_profile_exists): True 34052 1727204439.51773: handler run complete 34052 1727204439.51775: attempt loop complete, returning result 34052 1727204439.51777: _execute() done 34052 1727204439.51780: dumping result to json 34052 1727204439.51782: done dumping result, returning 34052 1727204439.51791: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'veth0' [127b8e07-fff9-66a4-e2a3-0000000003b9] 34052 1727204439.51794: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003b9 34052 1727204439.51869: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003b9 34052 1727204439.51872: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34052 1727204439.51923: no more pending results, returning what we have 34052 1727204439.51929: results queue empty 34052 1727204439.51930: checking for any_errors_fatal 34052 1727204439.51938: done checking for any_errors_fatal 34052 1727204439.51938: checking for max_fail_percentage 34052 1727204439.51940: done checking for max_fail_percentage 34052 1727204439.51941: checking to see if all hosts have failed and the running result is not ok 34052 1727204439.51942: done checking to see if all hosts have failed 34052 1727204439.51943: getting the remaining hosts for this loop 34052 1727204439.51944: done getting the remaining hosts for this loop 34052 1727204439.51949: getting the next task for host managed-node1 34052 1727204439.51955: done getting next task for host managed-node1 34052 1727204439.51958: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 34052 1727204439.51961: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204439.52071: getting variables 34052 1727204439.52073: in VariableManager get_vars() 34052 1727204439.52117: Calling all_inventory to load vars for managed-node1 34052 1727204439.52120: Calling groups_inventory to load vars for managed-node1 34052 1727204439.52123: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204439.52136: Calling all_plugins_play to load vars for managed-node1 34052 1727204439.52139: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204439.52146: Calling groups_plugins_play to load vars for managed-node1 34052 1727204439.54358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204439.55989: done with get_vars() 34052 1727204439.56032: done getting variables 34052 1727204439.56134: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204439.56273: variable 'profile' from source: include params 34052 1727204439.56278: variable 'interface' from source: play vars 34052 1727204439.56354: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:00:39 -0400 (0:00:00.067) 0:00:25.881 ***** 34052 1727204439.56400: entering _queue_task() for managed-node1/assert 34052 1727204439.56830: worker is 1 (out of 1 available) 34052 1727204439.56845: exiting _queue_task() for managed-node1/assert 34052 1727204439.56859: done queuing things up, now waiting for results queue to drain 34052 1727204439.56861: waiting for pending results... 
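The three assert tasks traced in this part of the log all come from the shared task file tasks/assert_profile_present.yml (the task paths :10 and :15 are shown for the second and third assertion). Each one evaluates a fact that an earlier step recorded with set_fact: lsr_net_profile_exists, lsr_net_profile_ansible_managed, and lsr_net_profile_fingerprint. A minimal sketch of what that task file plausibly contains, reconstructed from the task names and evaluated conditionals in this log (the exact file contents are an assumption, not a verbatim copy):

    # tasks/assert_profile_present.yml -- illustrative reconstruction only
    - name: "Assert that the profile is present - '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_exists            # fact set earlier via set_fact, per the log

    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      assert:
        that:
          - lsr_net_profile_fingerprint

With all three facts true for profile 'veth0', each task reports "All assertions passed" with changed: false, matching the ok results recorded in this log.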
34052 1727204439.57354: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'veth0' 34052 1727204439.57361: in run() - task 127b8e07-fff9-66a4-e2a3-0000000003ba 34052 1727204439.57370: variable 'ansible_search_path' from source: unknown 34052 1727204439.57374: variable 'ansible_search_path' from source: unknown 34052 1727204439.57383: calling self._execute() 34052 1727204439.57491: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.57495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.57514: variable 'omit' from source: magic vars 34052 1727204439.57847: variable 'ansible_distribution_major_version' from source: facts 34052 1727204439.57857: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204439.57864: variable 'omit' from source: magic vars 34052 1727204439.57898: variable 'omit' from source: magic vars 34052 1727204439.58024: variable 'profile' from source: include params 34052 1727204439.58028: variable 'interface' from source: play vars 34052 1727204439.58091: variable 'interface' from source: play vars 34052 1727204439.58116: variable 'omit' from source: magic vars 34052 1727204439.58156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204439.58211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204439.58238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204439.58269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.58273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.58305: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204439.58311: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.58317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.58413: Set connection var ansible_connection to ssh 34052 1727204439.58422: Set connection var ansible_timeout to 10 34052 1727204439.58427: Set connection var ansible_pipelining to False 34052 1727204439.58433: Set connection var ansible_shell_type to sh 34052 1727204439.58441: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204439.58449: Set connection var ansible_shell_executable to /bin/sh 34052 1727204439.58471: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.58474: variable 'ansible_connection' from source: unknown 34052 1727204439.58481: variable 'ansible_module_compression' from source: unknown 34052 1727204439.58484: variable 'ansible_shell_type' from source: unknown 34052 1727204439.58487: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.58489: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.58492: variable 'ansible_pipelining' from source: unknown 34052 1727204439.58495: variable 'ansible_timeout' from source: unknown 34052 1727204439.58497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.58615: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204439.58626: variable 'omit' from source: magic vars 34052 1727204439.58632: starting attempt loop 34052 1727204439.58637: running the handler 34052 1727204439.58724: variable 'lsr_net_profile_ansible_managed' from source: set_fact 34052 1727204439.58732: Evaluated conditional (lsr_net_profile_ansible_managed): True 34052 1727204439.58741: handler run complete 34052 1727204439.58754: attempt loop complete, returning result 34052 1727204439.58758: _execute() done 34052 1727204439.58760: dumping result to json 34052 1727204439.58763: done dumping result, returning 34052 1727204439.58772: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'veth0' [127b8e07-fff9-66a4-e2a3-0000000003ba] 34052 1727204439.58775: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003ba 34052 1727204439.58870: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003ba 34052 1727204439.58873: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34052 1727204439.58931: no more pending results, returning what we have 34052 1727204439.58934: results queue empty 34052 1727204439.58935: checking for any_errors_fatal 34052 1727204439.58942: done checking for any_errors_fatal 34052 1727204439.58942: checking for max_fail_percentage 34052 1727204439.58944: done checking for max_fail_percentage 34052 1727204439.58945: checking to see if all hosts have failed and the running result is not ok 34052 1727204439.58946: done checking to see if all hosts have failed 34052 1727204439.58946: getting the remaining hosts for this loop 34052 1727204439.58949: done getting the remaining hosts for this loop 34052 1727204439.58953: getting the next task for host managed-node1 34052 1727204439.58959: done getting next task for host managed-node1 34052 1727204439.58962: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 34052 1727204439.58968: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204439.58973: getting variables 34052 1727204439.58975: in VariableManager get_vars() 34052 1727204439.59027: Calling all_inventory to load vars for managed-node1 34052 1727204439.59030: Calling groups_inventory to load vars for managed-node1 34052 1727204439.59033: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204439.59044: Calling all_plugins_play to load vars for managed-node1 34052 1727204439.59047: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204439.59050: Calling groups_plugins_play to load vars for managed-node1 34052 1727204439.61364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204439.63144: done with get_vars() 34052 1727204439.63179: done getting variables 34052 1727204439.63233: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204439.63329: variable 'profile' from source: include params 34052 1727204439.63332: variable 'interface' from source: play vars 34052 1727204439.63380: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:00:39 -0400 (0:00:00.070) 0:00:25.951 ***** 34052 1727204439.63411: entering _queue_task() for managed-node1/assert 34052 1727204439.63713: worker is 1 (out of 1 available) 34052 1727204439.63732: exiting _queue_task() for managed-node1/assert 34052 1727204439.63746: done queuing things up, now waiting for results queue to drain 34052 1727204439.63748: waiting for pending results... 
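Before each handler runs, the executor resolves the connection and shell settings that appear as the "Set connection var ..." entries: an ssh connection, a 10 second timeout, pipelining disabled, an sh shell at /bin/sh, and ZIP_DEFLATED module compression, with ansible_host and ansible_ssh_extra_args supplied by host vars for managed-node1. For reference, a hypothetical inventory fragment that would yield equivalent per-host settings (the variable names are standard Ansible connection variables; the address and SSH options are placeholders, not values taken from this run):

    all:
      hosts:
        managed-node1:
          ansible_connection: ssh
          ansible_host: "<managed node address>"          # placeholder
          ansible_ssh_extra_args: "<extra ssh options>"   # placeholder
          ansible_timeout: 10
          ansible_pipelining: false
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh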
34052 1727204439.63945: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in veth0 34052 1727204439.64033: in run() - task 127b8e07-fff9-66a4-e2a3-0000000003bb 34052 1727204439.64045: variable 'ansible_search_path' from source: unknown 34052 1727204439.64049: variable 'ansible_search_path' from source: unknown 34052 1727204439.64086: calling self._execute() 34052 1727204439.64172: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.64176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.64191: variable 'omit' from source: magic vars 34052 1727204439.64692: variable 'ansible_distribution_major_version' from source: facts 34052 1727204439.64700: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204439.64772: variable 'omit' from source: magic vars 34052 1727204439.64776: variable 'omit' from source: magic vars 34052 1727204439.64933: variable 'profile' from source: include params 34052 1727204439.64945: variable 'interface' from source: play vars 34052 1727204439.65043: variable 'interface' from source: play vars 34052 1727204439.65075: variable 'omit' from source: magic vars 34052 1727204439.65144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204439.65192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204439.65219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204439.65259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.65343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.65349: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204439.65351: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.65353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.65475: Set connection var ansible_connection to ssh 34052 1727204439.65583: Set connection var ansible_timeout to 10 34052 1727204439.65586: Set connection var ansible_pipelining to False 34052 1727204439.65588: Set connection var ansible_shell_type to sh 34052 1727204439.65590: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204439.65592: Set connection var ansible_shell_executable to /bin/sh 34052 1727204439.65594: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.65596: variable 'ansible_connection' from source: unknown 34052 1727204439.65598: variable 'ansible_module_compression' from source: unknown 34052 1727204439.65600: variable 'ansible_shell_type' from source: unknown 34052 1727204439.65602: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.65604: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.65606: variable 'ansible_pipelining' from source: unknown 34052 1727204439.65609: variable 'ansible_timeout' from source: unknown 34052 1727204439.65611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.65867: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204439.65913: variable 'omit' from source: magic vars 34052 1727204439.65917: starting attempt loop 34052 1727204439.65919: running the handler 34052 1727204439.66056: variable 'lsr_net_profile_fingerprint' from source: set_fact 34052 1727204439.66064: Evaluated conditional (lsr_net_profile_fingerprint): True 34052 1727204439.66071: handler run complete 34052 1727204439.66087: attempt loop complete, returning result 34052 1727204439.66090: _execute() done 34052 1727204439.66092: dumping result to json 34052 1727204439.66099: done dumping result, returning 34052 1727204439.66107: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in veth0 [127b8e07-fff9-66a4-e2a3-0000000003bb] 34052 1727204439.66110: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003bb 34052 1727204439.66217: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000003bb 34052 1727204439.66220: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34052 1727204439.66303: no more pending results, returning what we have 34052 1727204439.66309: results queue empty 34052 1727204439.66310: checking for any_errors_fatal 34052 1727204439.66318: done checking for any_errors_fatal 34052 1727204439.66318: checking for max_fail_percentage 34052 1727204439.66320: done checking for max_fail_percentage 34052 1727204439.66321: checking to see if all hosts have failed and the running result is not ok 34052 1727204439.66322: done checking to see if all hosts have failed 34052 1727204439.66323: getting the remaining hosts for this loop 34052 1727204439.66325: done getting the remaining hosts for this loop 34052 1727204439.66331: getting the next task for host managed-node1 34052 1727204439.66339: done getting next task for host managed-node1 34052 1727204439.66342: ^ task is: TASK: Get ip address information 34052 1727204439.66344: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204439.66350: getting variables 34052 1727204439.66351: in VariableManager get_vars() 34052 1727204439.66398: Calling all_inventory to load vars for managed-node1 34052 1727204439.66401: Calling groups_inventory to load vars for managed-node1 34052 1727204439.66404: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204439.66416: Calling all_plugins_play to load vars for managed-node1 34052 1727204439.66419: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204439.66422: Calling groups_plugins_play to load vars for managed-node1 34052 1727204439.67996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204439.69215: done with get_vars() 34052 1727204439.69244: done getting variables 34052 1727204439.69299: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Tuesday 24 September 2024 15:00:39 -0400 (0:00:00.059) 0:00:26.010 ***** 34052 1727204439.69324: entering _queue_task() for managed-node1/command 34052 1727204439.69627: worker is 1 (out of 1 available) 34052 1727204439.69643: exiting _queue_task() for managed-node1/command 34052 1727204439.69657: done queuing things up, now waiting for results queue to drain 34052 1727204439.69659: waiting for pending results... 34052 1727204439.69869: running TaskExecutor() for managed-node1/TASK: Get ip address information 34052 1727204439.69948: in run() - task 127b8e07-fff9-66a4-e2a3-00000000005e 34052 1727204439.69960: variable 'ansible_search_path' from source: unknown 34052 1727204439.69998: calling self._execute() 34052 1727204439.70085: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.70090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.70099: variable 'omit' from source: magic vars 34052 1727204439.70592: variable 'ansible_distribution_major_version' from source: facts 34052 1727204439.70598: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204439.70601: variable 'omit' from source: magic vars 34052 1727204439.70603: variable 'omit' from source: magic vars 34052 1727204439.70673: variable 'interface' from source: play vars 34052 1727204439.70699: variable 'omit' from source: magic vars 34052 1727204439.70833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204439.70837: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204439.70840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204439.70843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.70845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204439.70871: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 
1727204439.70874: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.70879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.70986: Set connection var ansible_connection to ssh 34052 1727204439.70995: Set connection var ansible_timeout to 10 34052 1727204439.71002: Set connection var ansible_pipelining to False 34052 1727204439.71005: Set connection var ansible_shell_type to sh 34052 1727204439.71015: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204439.71023: Set connection var ansible_shell_executable to /bin/sh 34052 1727204439.71069: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.71074: variable 'ansible_connection' from source: unknown 34052 1727204439.71077: variable 'ansible_module_compression' from source: unknown 34052 1727204439.71080: variable 'ansible_shell_type' from source: unknown 34052 1727204439.71082: variable 'ansible_shell_executable' from source: unknown 34052 1727204439.71085: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204439.71087: variable 'ansible_pipelining' from source: unknown 34052 1727204439.71090: variable 'ansible_timeout' from source: unknown 34052 1727204439.71092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204439.71280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204439.71284: variable 'omit' from source: magic vars 34052 1727204439.71287: starting attempt loop 34052 1727204439.71289: running the handler 34052 1727204439.71292: _low_level_execute_command(): starting 34052 1727204439.71294: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204439.71941: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.71947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.71952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.71996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204439.72004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204439.72006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204439.72077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204439.73879: stdout chunk 
(state=3): >>>/root <<< 34052 1727204439.74037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204439.74074: stderr chunk (state=3): >>><<< 34052 1727204439.74078: stdout chunk (state=3): >>><<< 34052 1727204439.74105: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204439.74136: _low_level_execute_command(): starting 34052 1727204439.74141: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244 `" && echo ansible-tmp-1727204439.7410147-35820-145230707567244="` echo /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244 `" ) && sleep 0' 34052 1727204439.74729: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.74733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.74736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.74746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.74799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204439.74806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204439.74809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204439.74867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204439.76958: stdout chunk (state=3): 
>>>ansible-tmp-1727204439.7410147-35820-145230707567244=/root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244 <<< 34052 1727204439.77067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204439.77137: stderr chunk (state=3): >>><<< 34052 1727204439.77140: stdout chunk (state=3): >>><<< 34052 1727204439.77156: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204439.7410147-35820-145230707567244=/root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204439.77189: variable 'ansible_module_compression' from source: unknown 34052 1727204439.77242: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204439.77275: variable 'ansible_facts' from source: unknown 34052 1727204439.77341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/AnsiballZ_command.py 34052 1727204439.77464: Sending initial data 34052 1727204439.77471: Sent initial data (156 bytes) 34052 1727204439.77968: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.77974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.77994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204439.77997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.78070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204439.78074: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204439.78076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204439.78139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204439.79818: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204439.79862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204439.79912: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp7_64fu_h /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/AnsiballZ_command.py <<< 34052 1727204439.79915: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/AnsiballZ_command.py" <<< 34052 1727204439.79953: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 34052 1727204439.79968: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp7_64fu_h" to remote "/root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/AnsiballZ_command.py" <<< 34052 1727204439.80572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204439.80650: stderr chunk (state=3): >>><<< 34052 1727204439.80654: stdout chunk (state=3): >>><<< 34052 1727204439.80676: done transferring module to remote 34052 1727204439.80687: _low_level_execute_command(): starting 34052 1727204439.80694: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/ /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/AnsiballZ_command.py && sleep 0' 34052 1727204439.81200: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.81204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.81207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.81210: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.81272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204439.81280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204439.81330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204439.83246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204439.83308: stderr chunk (state=3): >>><<< 34052 1727204439.83313: stdout chunk (state=3): >>><<< 34052 1727204439.83329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204439.83333: _low_level_execute_command(): starting 34052 1727204439.83335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/AnsiballZ_command.py && sleep 0' 34052 1727204439.83860: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204439.83864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.83866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204439.83874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204439.83877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204439.83913: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204439.83917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204439.83996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.01975: stdout chunk (state=3): >>> {"changed": true, "stdout": "27: veth0@if26: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 5e:35:1f:22:31:71 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::5c35:1fff:fe22:3171/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-24 15:00:40.012762", "end": "2024-09-24 15:00:40.017067", "delta": "0:00:00.004305", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204440.03577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.03581: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. <<< 34052 1727204440.03759: stderr chunk (state=3): >>><<< 34052 1727204440.03798: stdout chunk (state=3): >>><<< 34052 1727204440.03853: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "27: veth0@if26: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 5e:35:1f:22:31:71 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::5c35:1fff:fe22:3171/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-24 15:00:40.012762", "end": "2024-09-24 15:00:40.017067", "delta": "0:00:00.004305", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204440.03971: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204440.03980: _low_level_execute_command(): starting 34052 1727204440.04011: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204439.7410147-35820-145230707567244/ > /dev/null 2>&1 && sleep 0' 34052 1727204440.05543: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204440.05548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204440.05555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204440.05558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204440.05562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204440.05610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.07691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.07822: stderr chunk (state=3): >>><<< 34052 1727204440.07829: stdout chunk (state=3): >>><<< 34052 1727204440.07845: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204440.07852: handler run complete 34052 1727204440.07882: Evaluated conditional (False): False 34052 1727204440.07931: attempt loop complete, returning result 34052 1727204440.07935: _execute() done 34052 1727204440.07937: dumping result to json 34052 1727204440.07940: done dumping result, returning 34052 1727204440.07971: done running TaskExecutor() for managed-node1/TASK: Get ip address information [127b8e07-fff9-66a4-e2a3-00000000005e] 34052 1727204440.07974: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005e 34052 1727204440.08258: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005e 34052 1727204440.08261: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.004305", "end": "2024-09-24 15:00:40.017067", "rc": 0, "start": "2024-09-24 15:00:40.012762" } STDOUT: 27: veth0@if26: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether 5e:35:1f:22:31:71 brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::5c35:1fff:fe22:3171/64 scope link noprefixroute valid_lft forever preferred_lft forever 34052 1727204440.08375: no more pending results, returning what we have 34052 1727204440.08379: results queue empty 34052 1727204440.08380: checking for any_errors_fatal 34052 1727204440.08476: done checking for any_errors_fatal 34052 1727204440.08478: checking for max_fail_percentage 34052 1727204440.08480: done checking for max_fail_percentage 34052 1727204440.08481: checking to see if all hosts have failed and the running result is not ok 34052 1727204440.08482: done checking to see if all hosts have failed 34052 1727204440.08483: getting the remaining hosts for this loop 34052 1727204440.08485: done getting the remaining hosts for this loop 34052 1727204440.08490: getting the next task for host managed-node1 34052 1727204440.08612: done getting next task for host managed-node1 34052 1727204440.08616: ^ task is: TASK: Show ip_addr 34052 1727204440.08618: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204440.08623: getting variables 34052 1727204440.08625: in VariableManager get_vars() 34052 1727204440.09072: Calling all_inventory to load vars for managed-node1 34052 1727204440.09076: Calling groups_inventory to load vars for managed-node1 34052 1727204440.09079: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204440.09093: Calling all_plugins_play to load vars for managed-node1 34052 1727204440.09096: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204440.09100: Calling groups_plugins_play to load vars for managed-node1 34052 1727204440.15737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204440.24056: done with get_vars() 34052 1727204440.24219: done getting variables 34052 1727204440.24541: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Tuesday 24 September 2024 15:00:40 -0400 (0:00:00.552) 0:00:26.563 ***** 34052 1727204440.24596: entering _queue_task() for managed-node1/debug 34052 1727204440.25859: worker is 1 (out of 1 available) 34052 1727204440.25876: exiting _queue_task() for managed-node1/debug 34052 1727204440.25893: done queuing things up, now waiting for results queue to drain 34052 1727204440.25895: waiting for pending results... 34052 1727204440.26990: running TaskExecutor() for managed-node1/TASK: Show ip_addr 34052 1727204440.27305: in run() - task 127b8e07-fff9-66a4-e2a3-00000000005f 34052 1727204440.27310: variable 'ansible_search_path' from source: unknown 34052 1727204440.27314: calling self._execute() 34052 1727204440.27775: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204440.27780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.27784: variable 'omit' from source: magic vars 34052 1727204440.28835: variable 'ansible_distribution_major_version' from source: facts 34052 1727204440.28928: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204440.28949: variable 'omit' from source: magic vars 34052 1727204440.29101: variable 'omit' from source: magic vars 34052 1727204440.29172: variable 'omit' from source: magic vars 34052 1727204440.29245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204440.29313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204440.29630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204440.29634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204440.29754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204440.29758: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204440.29760: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 
1727204440.29763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.30274: Set connection var ansible_connection to ssh 34052 1727204440.30493: Set connection var ansible_timeout to 10 34052 1727204440.30497: Set connection var ansible_pipelining to False 34052 1727204440.30499: Set connection var ansible_shell_type to sh 34052 1727204440.30501: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204440.30503: Set connection var ansible_shell_executable to /bin/sh 34052 1727204440.30505: variable 'ansible_shell_executable' from source: unknown 34052 1727204440.30507: variable 'ansible_connection' from source: unknown 34052 1727204440.30510: variable 'ansible_module_compression' from source: unknown 34052 1727204440.30512: variable 'ansible_shell_type' from source: unknown 34052 1727204440.30514: variable 'ansible_shell_executable' from source: unknown 34052 1727204440.30516: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204440.30518: variable 'ansible_pipelining' from source: unknown 34052 1727204440.30520: variable 'ansible_timeout' from source: unknown 34052 1727204440.30522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.31106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204440.31128: variable 'omit' from source: magic vars 34052 1727204440.31196: starting attempt loop 34052 1727204440.31203: running the handler 34052 1727204440.31591: variable 'ip_addr' from source: set_fact 34052 1727204440.31621: handler run complete 34052 1727204440.31652: attempt loop complete, returning result 34052 1727204440.31697: _execute() done 34052 1727204440.31706: dumping result to json 34052 1727204440.31715: done dumping result, returning 34052 1727204440.31733: done running TaskExecutor() for managed-node1/TASK: Show ip_addr [127b8e07-fff9-66a4-e2a3-00000000005f] 34052 1727204440.31741: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005f 34052 1727204440.32086: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000005f 34052 1727204440.32090: WORKER PROCESS EXITING ok: [managed-node1] => { "ip_addr.stdout": "27: veth0@if26: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 5e:35:1f:22:31:71 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::5c35:1fff:fe22:3171/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 34052 1727204440.32153: no more pending results, returning what we have 34052 1727204440.32157: results queue empty 34052 1727204440.32158: checking for any_errors_fatal 34052 1727204440.32171: done checking for any_errors_fatal 34052 1727204440.32172: checking for max_fail_percentage 34052 1727204440.32174: done checking for max_fail_percentage 34052 1727204440.32175: checking to see if all hosts have failed and the running result is not ok 34052 1727204440.32176: done checking to see if all hosts have failed 34052 1727204440.32177: getting the 
remaining hosts for this loop 34052 1727204440.32179: done getting the remaining hosts for this loop 34052 1727204440.32185: getting the next task for host managed-node1 34052 1727204440.32193: done getting next task for host managed-node1 34052 1727204440.32197: ^ task is: TASK: Assert ipv6 addresses are correctly set 34052 1727204440.32199: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204440.32204: getting variables 34052 1727204440.32206: in VariableManager get_vars() 34052 1727204440.32262: Calling all_inventory to load vars for managed-node1 34052 1727204440.32670: Calling groups_inventory to load vars for managed-node1 34052 1727204440.32675: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204440.32687: Calling all_plugins_play to load vars for managed-node1 34052 1727204440.32691: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204440.32694: Calling groups_plugins_play to load vars for managed-node1 34052 1727204440.36805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204440.41653: done with get_vars() 34052 1727204440.41811: done getting variables 34052 1727204440.41998: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Tuesday 24 September 2024 15:00:40 -0400 (0:00:00.175) 0:00:26.738 ***** 34052 1727204440.42150: entering _queue_task() for managed-node1/assert 34052 1727204440.42905: worker is 1 (out of 1 available) 34052 1727204440.42921: exiting _queue_task() for managed-node1/assert 34052 1727204440.42938: done queuing things up, now waiting for results queue to drain 34052 1727204440.42940: waiting for pending results... 
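The last three tasks traced in this section come from tests_ipv6.yml (task paths :53, :57, and :60): a command task that captures 'ip addr show veth0' into ip_addr, a debug task that prints ip_addr.stdout, and an assert task that checks the three configured addresses. A minimal sketch of that sequence as it plausibly appears in the playbook, reconstructed from the task names, the executed command, and the evaluated conditionals in this log (the exact file contents are an assumption):

    - name: Get ip address information
      command: ip addr show {{ interface }}
      register: ip_addr
      changed_when: false        # consistent with the "Evaluated conditional (False): False" entry above

    - name: Show ip_addr
      debug:
        var: ip_addr.stdout

    - name: Assert ipv6 addresses are correctly set
      assert:
        that:
          - "'inet6 2001:db8::2/32' in ip_addr.stdout"
          - "'inet6 2001:db8::3/32' in ip_addr.stdout"
          - "'inet6 2001:db8::4/32' in ip_addr.stdout"

All three membership checks evaluate to True against the captured stdout, so the assert reports "All assertions passed" in the result that follows.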
34052 1727204440.43516: running TaskExecutor() for managed-node1/TASK: Assert ipv6 addresses are correctly set 34052 1727204440.43755: in run() - task 127b8e07-fff9-66a4-e2a3-000000000060 34052 1727204440.43760: variable 'ansible_search_path' from source: unknown 34052 1727204440.43767: calling self._execute() 34052 1727204440.44088: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204440.44093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.44097: variable 'omit' from source: magic vars 34052 1727204440.44591: variable 'ansible_distribution_major_version' from source: facts 34052 1727204440.44596: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204440.44599: variable 'omit' from source: magic vars 34052 1727204440.44602: variable 'omit' from source: magic vars 34052 1727204440.44636: variable 'omit' from source: magic vars 34052 1727204440.44860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204440.44864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204440.44868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204440.44871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204440.44873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204440.44876: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204440.44878: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204440.44880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.45020: Set connection var ansible_connection to ssh 34052 1727204440.45030: Set connection var ansible_timeout to 10 34052 1727204440.45036: Set connection var ansible_pipelining to False 34052 1727204440.45039: Set connection var ansible_shell_type to sh 34052 1727204440.45050: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204440.45061: Set connection var ansible_shell_executable to /bin/sh 34052 1727204440.45098: variable 'ansible_shell_executable' from source: unknown 34052 1727204440.45101: variable 'ansible_connection' from source: unknown 34052 1727204440.45103: variable 'ansible_module_compression' from source: unknown 34052 1727204440.45106: variable 'ansible_shell_type' from source: unknown 34052 1727204440.45109: variable 'ansible_shell_executable' from source: unknown 34052 1727204440.45111: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204440.45203: variable 'ansible_pipelining' from source: unknown 34052 1727204440.45206: variable 'ansible_timeout' from source: unknown 34052 1727204440.45210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.45329: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204440.45333: variable 'omit' from source: magic vars 34052 1727204440.45335: starting attempt loop 34052 1727204440.45338: 
running the handler 34052 1727204440.45546: variable 'ip_addr' from source: set_fact 34052 1727204440.45553: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 34052 1727204440.45754: variable 'ip_addr' from source: set_fact 34052 1727204440.45757: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 34052 1727204440.45870: variable 'ip_addr' from source: set_fact 34052 1727204440.45881: Evaluated conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 34052 1727204440.45889: handler run complete 34052 1727204440.45906: attempt loop complete, returning result 34052 1727204440.45909: _execute() done 34052 1727204440.45913: dumping result to json 34052 1727204440.45916: done dumping result, returning 34052 1727204440.45923: done running TaskExecutor() for managed-node1/TASK: Assert ipv6 addresses are correctly set [127b8e07-fff9-66a4-e2a3-000000000060] 34052 1727204440.45930: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000060 34052 1727204440.46116: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000060 34052 1727204440.46118: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34052 1727204440.46200: no more pending results, returning what we have 34052 1727204440.46203: results queue empty 34052 1727204440.46204: checking for any_errors_fatal 34052 1727204440.46210: done checking for any_errors_fatal 34052 1727204440.46211: checking for max_fail_percentage 34052 1727204440.46212: done checking for max_fail_percentage 34052 1727204440.46213: checking to see if all hosts have failed and the running result is not ok 34052 1727204440.46214: done checking to see if all hosts have failed 34052 1727204440.46215: getting the remaining hosts for this loop 34052 1727204440.46216: done getting the remaining hosts for this loop 34052 1727204440.46220: getting the next task for host managed-node1 34052 1727204440.46231: done getting next task for host managed-node1 34052 1727204440.46234: ^ task is: TASK: Get ipv6 routes 34052 1727204440.46236: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204440.46240: getting variables 34052 1727204440.46241: in VariableManager get_vars() 34052 1727204440.46284: Calling all_inventory to load vars for managed-node1 34052 1727204440.46286: Calling groups_inventory to load vars for managed-node1 34052 1727204440.46289: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204440.46300: Calling all_plugins_play to load vars for managed-node1 34052 1727204440.46303: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204440.46306: Calling groups_plugins_play to load vars for managed-node1 34052 1727204440.48709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204440.51094: done with get_vars() 34052 1727204440.51149: done getting variables 34052 1727204440.51301: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Tuesday 24 September 2024 15:00:40 -0400 (0:00:00.091) 0:00:26.830 ***** 34052 1727204440.51342: entering _queue_task() for managed-node1/command 34052 1727204440.51991: worker is 1 (out of 1 available) 34052 1727204440.52008: exiting _queue_task() for managed-node1/command 34052 1727204440.52136: done queuing things up, now waiting for results queue to drain 34052 1727204440.52138: waiting for pending results... 34052 1727204440.52483: running TaskExecutor() for managed-node1/TASK: Get ipv6 routes 34052 1727204440.52489: in run() - task 127b8e07-fff9-66a4-e2a3-000000000061 34052 1727204440.52493: variable 'ansible_search_path' from source: unknown 34052 1727204440.52517: calling self._execute() 34052 1727204440.52642: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204440.52687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.52691: variable 'omit' from source: magic vars 34052 1727204440.53140: variable 'ansible_distribution_major_version' from source: facts 34052 1727204440.53234: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204440.53238: variable 'omit' from source: magic vars 34052 1727204440.53241: variable 'omit' from source: magic vars 34052 1727204440.53258: variable 'omit' from source: magic vars 34052 1727204440.53312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204440.53370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204440.53398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204440.53421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204440.53447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204440.53489: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204440.53498: variable 'ansible_host' from source: host vars for 
'managed-node1' 34052 1727204440.53506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.53644: Set connection var ansible_connection to ssh 34052 1727204440.53670: Set connection var ansible_timeout to 10 34052 1727204440.53773: Set connection var ansible_pipelining to False 34052 1727204440.53776: Set connection var ansible_shell_type to sh 34052 1727204440.53780: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204440.53782: Set connection var ansible_shell_executable to /bin/sh 34052 1727204440.53785: variable 'ansible_shell_executable' from source: unknown 34052 1727204440.53787: variable 'ansible_connection' from source: unknown 34052 1727204440.53789: variable 'ansible_module_compression' from source: unknown 34052 1727204440.53792: variable 'ansible_shell_type' from source: unknown 34052 1727204440.53794: variable 'ansible_shell_executable' from source: unknown 34052 1727204440.53796: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204440.53799: variable 'ansible_pipelining' from source: unknown 34052 1727204440.53801: variable 'ansible_timeout' from source: unknown 34052 1727204440.53803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204440.53971: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204440.54028: variable 'omit' from source: magic vars 34052 1727204440.54032: starting attempt loop 34052 1727204440.54034: running the handler 34052 1727204440.54037: _low_level_execute_command(): starting 34052 1727204440.54053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204440.55003: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204440.55054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204440.55081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204440.55096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204440.55189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.57273: stdout chunk (state=3): >>>/root <<< 34052 1727204440.57277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.57280: stdout chunk (state=3): >>><<< 34052 
1727204440.57282: stderr chunk (state=3): >>><<< 34052 1727204440.57285: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204440.57287: _low_level_execute_command(): starting 34052 1727204440.57289: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464 `" && echo ansible-tmp-1727204440.5724607-35857-59992268899464="` echo /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464 `" ) && sleep 0' 34052 1727204440.58995: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204440.59002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204440.59378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204440.59483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.61612: stdout chunk (state=3): >>>ansible-tmp-1727204440.5724607-35857-59992268899464=/root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464 <<< 34052 1727204440.61772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.61824: stderr chunk (state=3): >>><<< 34052 1727204440.61827: stdout chunk (state=3): >>><<< 34052 1727204440.61854: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204440.5724607-35857-59992268899464=/root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204440.61900: variable 'ansible_module_compression' from source: unknown 34052 1727204440.62009: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204440.62013: variable 'ansible_facts' from source: unknown 34052 1727204440.62079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/AnsiballZ_command.py 34052 1727204440.62327: Sending initial data 34052 1727204440.62354: Sent initial data (155 bytes) 34052 1727204440.63976: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204440.63991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204440.64339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204440.64605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.66490: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 
debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204440.66555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204440.66621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpicrerkpx /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/AnsiballZ_command.py <<< 34052 1727204440.66628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/AnsiballZ_command.py" <<< 34052 1727204440.66663: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpicrerkpx" to remote "/root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/AnsiballZ_command.py" <<< 34052 1727204440.68259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.68389: stderr chunk (state=3): >>><<< 34052 1727204440.68393: stdout chunk (state=3): >>><<< 34052 1727204440.68419: done transferring module to remote 34052 1727204440.68434: _low_level_execute_command(): starting 34052 1727204440.68475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/ /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/AnsiballZ_command.py && sleep 0' 34052 1727204440.69535: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204440.69558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204440.69676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204440.69718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.72009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.72014: stdout chunk (state=3): >>><<< 34052 1727204440.72016: stderr chunk (state=3): >>><<< 34052 1727204440.72020: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204440.72031: _low_level_execute_command(): starting 34052 1727204440.72074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/AnsiballZ_command.py && sleep 0' 34052 1727204440.73269: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204440.73372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204440.73483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204440.73507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204440.73605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.91589: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:00:40.910138", "end": "2024-09-24 15:00:40.914171", "delta": "0:00:00.004033", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204440.93233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.93391: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. <<< 34052 1727204440.93396: stdout chunk (state=3): >>><<< 34052 1727204440.93398: stderr chunk (state=3): >>><<< 34052 1727204440.93601: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:00:40.910138", "end": "2024-09-24 15:00:40.914171", "delta": "0:00:00.004033", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204440.93689: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204440.93776: _low_level_execute_command(): starting 34052 1727204440.93780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204440.5724607-35857-59992268899464/ > /dev/null 2>&1 && sleep 0' 34052 1727204440.95223: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204440.95332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204440.95372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204440.95395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204440.95486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204440.97628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204440.97662: stdout chunk (state=3): >>><<< 34052 1727204440.97739: stderr chunk (state=3): >>><<< 34052 1727204440.97887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204440.97895: handler run complete 34052 1727204440.97897: Evaluated conditional (False): False 34052 1727204440.97900: attempt loop complete, returning result 34052 1727204440.97902: _execute() done 34052 1727204440.97904: dumping result to json 34052 1727204440.97906: done dumping result, returning 34052 1727204440.97908: done running TaskExecutor() for managed-node1/TASK: Get ipv6 routes [127b8e07-fff9-66a4-e2a3-000000000061] 34052 1727204440.97910: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000061 34052 1727204440.98136: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000061 34052 1727204440.98142: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.004033", "end": "2024-09-24 15:00:40.914171", "rc": 0, "start": "2024-09-24 15:00:40.910138" } STDOUT: 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 34052 1727204440.98247: no more pending results, returning what we have 34052 1727204440.98253: results queue empty 34052 1727204440.98254: checking for any_errors_fatal 34052 1727204440.98264: done checking for any_errors_fatal 34052 1727204440.98266: checking for max_fail_percentage 34052 1727204440.98268: done checking for max_fail_percentage 34052 1727204440.98269: checking to see if all hosts have failed and the running result is not ok 34052 1727204440.98272: done checking to see if all hosts have failed 34052 1727204440.98272: getting the remaining hosts for this loop 34052 1727204440.98275: done getting the remaining hosts for this loop 34052 1727204440.98280: getting the next task for host managed-node1 34052 1727204440.98287: done getting next task for host managed-node1 34052 1727204440.98289: ^ task is: TASK: Show ipv6_route 34052 1727204440.98291: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204440.98296: getting variables 34052 1727204440.98298: in VariableManager get_vars() 34052 1727204440.98417: Calling all_inventory to load vars for managed-node1 34052 1727204440.98421: Calling groups_inventory to load vars for managed-node1 34052 1727204440.98423: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204440.98489: Calling all_plugins_play to load vars for managed-node1 34052 1727204440.98493: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204440.98497: Calling groups_plugins_play to load vars for managed-node1 34052 1727204440.99904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204441.01341: done with get_vars() 34052 1727204441.01374: done getting variables 34052 1727204441.01463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Tuesday 24 September 2024 15:00:41 -0400 (0:00:00.501) 0:00:27.332 ***** 34052 1727204441.01522: entering _queue_task() for managed-node1/debug 34052 1727204441.01953: worker is 1 (out of 1 available) 34052 1727204441.02151: exiting _queue_task() for managed-node1/debug 34052 1727204441.02167: done queuing things up, now waiting for results queue to drain 34052 1727204441.02169: waiting for pending results... 34052 1727204441.02349: running TaskExecutor() for managed-node1/TASK: Show ipv6_route 34052 1727204441.02390: in run() - task 127b8e07-fff9-66a4-e2a3-000000000062 34052 1727204441.02412: variable 'ansible_search_path' from source: unknown 34052 1727204441.02543: calling self._execute() 34052 1727204441.02586: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.02593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.02612: variable 'omit' from source: magic vars 34052 1727204441.03251: variable 'ansible_distribution_major_version' from source: facts 34052 1727204441.03299: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204441.03332: variable 'omit' from source: magic vars 34052 1727204441.03401: variable 'omit' from source: magic vars 34052 1727204441.03461: variable 'omit' from source: magic vars 34052 1727204441.03524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204441.03584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204441.03612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204441.03652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204441.03663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204441.03692: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204441.03696: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 
1727204441.03698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.03785: Set connection var ansible_connection to ssh 34052 1727204441.03793: Set connection var ansible_timeout to 10 34052 1727204441.03799: Set connection var ansible_pipelining to False 34052 1727204441.03802: Set connection var ansible_shell_type to sh 34052 1727204441.03809: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204441.03816: Set connection var ansible_shell_executable to /bin/sh 34052 1727204441.03842: variable 'ansible_shell_executable' from source: unknown 34052 1727204441.03846: variable 'ansible_connection' from source: unknown 34052 1727204441.03849: variable 'ansible_module_compression' from source: unknown 34052 1727204441.03852: variable 'ansible_shell_type' from source: unknown 34052 1727204441.03854: variable 'ansible_shell_executable' from source: unknown 34052 1727204441.03858: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.03860: variable 'ansible_pipelining' from source: unknown 34052 1727204441.03863: variable 'ansible_timeout' from source: unknown 34052 1727204441.03865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.04011: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204441.04023: variable 'omit' from source: magic vars 34052 1727204441.04029: starting attempt loop 34052 1727204441.04032: running the handler 34052 1727204441.04157: variable 'ipv6_route' from source: set_fact 34052 1727204441.04172: handler run complete 34052 1727204441.04188: attempt loop complete, returning result 34052 1727204441.04191: _execute() done 34052 1727204441.04201: dumping result to json 34052 1727204441.04207: done dumping result, returning 34052 1727204441.04210: done running TaskExecutor() for managed-node1/TASK: Show ipv6_route [127b8e07-fff9-66a4-e2a3-000000000062] 34052 1727204441.04213: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000062 34052 1727204441.04307: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000062 34052 1727204441.04311: WORKER PROCESS EXITING ok: [managed-node1] => { "ipv6_route.stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 34052 1727204441.04375: no more pending results, returning what we have 34052 1727204441.04379: results queue empty 34052 1727204441.04380: checking for any_errors_fatal 34052 1727204441.04391: done checking for any_errors_fatal 34052 1727204441.04391: checking for max_fail_percentage 34052 1727204441.04393: done checking for max_fail_percentage 34052 1727204441.04394: checking to see if all hosts have failed and the running result is not ok 34052 1727204441.04395: done checking to see if all hosts have failed 34052 1727204441.04396: getting the remaining hosts for this loop 34052 1727204441.04397: done getting the remaining hosts for this loop 34052 1727204441.04402: getting the next task for host managed-node1 34052 1727204441.04409: done getting next task for host managed-node1 34052 1727204441.04413: ^ task is: 
TASK: Assert default ipv6 route is set 34052 1727204441.04415: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204441.04418: getting variables 34052 1727204441.04432: in VariableManager get_vars() 34052 1727204441.04479: Calling all_inventory to load vars for managed-node1 34052 1727204441.04482: Calling groups_inventory to load vars for managed-node1 34052 1727204441.04484: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204441.04496: Calling all_plugins_play to load vars for managed-node1 34052 1727204441.04499: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204441.04502: Calling groups_plugins_play to load vars for managed-node1 34052 1727204441.05908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204441.08082: done with get_vars() 34052 1727204441.08112: done getting variables 34052 1727204441.08168: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Tuesday 24 September 2024 15:00:41 -0400 (0:00:00.066) 0:00:27.399 ***** 34052 1727204441.08195: entering _queue_task() for managed-node1/assert 34052 1727204441.08519: worker is 1 (out of 1 available) 34052 1727204441.08537: exiting _queue_task() for managed-node1/assert 34052 1727204441.08553: done queuing things up, now waiting for results queue to drain 34052 1727204441.08555: waiting for pending results... 
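The "Show ipv6_route" task completed above and the "Assert default ipv6 route is set" task queued here (tests_ipv6.yml:73 and :76) print the registered route table and check that a default IPv6 route exists. A hedged sketch of what they likely look like; the exact definition of __test_str is not visible in this trace (only that it comes from task vars and that the interface play variable is referenced), so the value below is a hypothetical illustration:

- name: Show ipv6_route
  debug:
    var: ipv6_route.stdout

- name: Assert default ipv6 route is set
  vars:
    # Hypothetical: the real __test_str definition is not shown in this log.
    __test_str: "default via 2001:db8::1 dev {{ interface }}"
  assert:
    that:
      - __test_str in ipv6_route.stdout

Whatever its exact form, the evaluation reported further down, (__test_str in ipv6_route.stdout): True, is consistent with the captured route table, whose last entry is "default via 2001:db8::1 dev veth0 proto static metric 101 pref medium".
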
34052 1727204441.08786: running TaskExecutor() for managed-node1/TASK: Assert default ipv6 route is set 34052 1727204441.08915: in run() - task 127b8e07-fff9-66a4-e2a3-000000000063 34052 1727204441.08919: variable 'ansible_search_path' from source: unknown 34052 1727204441.08984: calling self._execute() 34052 1727204441.09072: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.09079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.09088: variable 'omit' from source: magic vars 34052 1727204441.09468: variable 'ansible_distribution_major_version' from source: facts 34052 1727204441.09478: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204441.09486: variable 'omit' from source: magic vars 34052 1727204441.09505: variable 'omit' from source: magic vars 34052 1727204441.09550: variable 'omit' from source: magic vars 34052 1727204441.09588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204441.09627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204441.09651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204441.09668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204441.09679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204441.09705: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204441.09709: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.09737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.09815: Set connection var ansible_connection to ssh 34052 1727204441.09818: Set connection var ansible_timeout to 10 34052 1727204441.09832: Set connection var ansible_pipelining to False 34052 1727204441.09835: Set connection var ansible_shell_type to sh 34052 1727204441.09840: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204441.09848: Set connection var ansible_shell_executable to /bin/sh 34052 1727204441.09872: variable 'ansible_shell_executable' from source: unknown 34052 1727204441.09876: variable 'ansible_connection' from source: unknown 34052 1727204441.09878: variable 'ansible_module_compression' from source: unknown 34052 1727204441.09881: variable 'ansible_shell_type' from source: unknown 34052 1727204441.09884: variable 'ansible_shell_executable' from source: unknown 34052 1727204441.09887: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.09891: variable 'ansible_pipelining' from source: unknown 34052 1727204441.09894: variable 'ansible_timeout' from source: unknown 34052 1727204441.09899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.10022: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204441.10036: variable 'omit' from source: magic vars 34052 1727204441.10039: starting attempt loop 34052 1727204441.10042: running 
the handler 34052 1727204441.10171: variable '__test_str' from source: task vars 34052 1727204441.10230: variable 'interface' from source: play vars 34052 1727204441.10240: variable 'ipv6_route' from source: set_fact 34052 1727204441.10250: Evaluated conditional (__test_str in ipv6_route.stdout): True 34052 1727204441.10257: handler run complete 34052 1727204441.10279: attempt loop complete, returning result 34052 1727204441.10282: _execute() done 34052 1727204441.10285: dumping result to json 34052 1727204441.10287: done dumping result, returning 34052 1727204441.10295: done running TaskExecutor() for managed-node1/TASK: Assert default ipv6 route is set [127b8e07-fff9-66a4-e2a3-000000000063] 34052 1727204441.10301: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000063 34052 1727204441.10400: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000063 34052 1727204441.10403: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34052 1727204441.10460: no more pending results, returning what we have 34052 1727204441.10463: results queue empty 34052 1727204441.10464: checking for any_errors_fatal 34052 1727204441.10472: done checking for any_errors_fatal 34052 1727204441.10473: checking for max_fail_percentage 34052 1727204441.10474: done checking for max_fail_percentage 34052 1727204441.10475: checking to see if all hosts have failed and the running result is not ok 34052 1727204441.10476: done checking to see if all hosts have failed 34052 1727204441.10477: getting the remaining hosts for this loop 34052 1727204441.10479: done getting the remaining hosts for this loop 34052 1727204441.10484: getting the next task for host managed-node1 34052 1727204441.10490: done getting next task for host managed-node1 34052 1727204441.10493: ^ task is: TASK: Ensure ping6 command is present 34052 1727204441.10494: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204441.10499: getting variables 34052 1727204441.10500: in VariableManager get_vars() 34052 1727204441.10548: Calling all_inventory to load vars for managed-node1 34052 1727204441.10554: Calling groups_inventory to load vars for managed-node1 34052 1727204441.10557: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204441.10577: Calling all_plugins_play to load vars for managed-node1 34052 1727204441.10582: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204441.10586: Calling groups_plugins_play to load vars for managed-node1 34052 1727204441.11720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204441.13742: done with get_vars() 34052 1727204441.13790: done getting variables 34052 1727204441.13907: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Tuesday 24 September 2024 15:00:41 -0400 (0:00:00.057) 0:00:27.456 ***** 34052 1727204441.13953: entering _queue_task() for managed-node1/package 34052 1727204441.14367: worker is 1 (out of 1 available) 34052 1727204441.14385: exiting _queue_task() for managed-node1/package 34052 1727204441.14406: done queuing things up, now waiting for results queue to drain 34052 1727204441.14409: waiting for pending results... 34052 1727204441.14685: running TaskExecutor() for managed-node1/TASK: Ensure ping6 command is present 34052 1727204441.14838: in run() - task 127b8e07-fff9-66a4-e2a3-000000000064 34052 1727204441.14843: variable 'ansible_search_path' from source: unknown 34052 1727204441.14881: calling self._execute() 34052 1727204441.14971: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.14975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.14986: variable 'omit' from source: magic vars 34052 1727204441.15313: variable 'ansible_distribution_major_version' from source: facts 34052 1727204441.15324: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204441.15333: variable 'omit' from source: magic vars 34052 1727204441.15351: variable 'omit' from source: magic vars 34052 1727204441.15634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204441.18759: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204441.18763: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204441.19001: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204441.19055: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204441.19085: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204441.19376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204441.19412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204441.19439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204441.19483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204441.19496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204441.19641: variable '__network_is_ostree' from source: set_fact 34052 1727204441.19647: variable 'omit' from source: magic vars 34052 1727204441.19687: variable 'omit' from source: magic vars 34052 1727204441.19720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204441.19774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204441.19798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204441.19815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204441.19825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204441.19861: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204441.19865: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.19869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.20070: Set connection var ansible_connection to ssh 34052 1727204441.20073: Set connection var ansible_timeout to 10 34052 1727204441.20076: Set connection var ansible_pipelining to False 34052 1727204441.20078: Set connection var ansible_shell_type to sh 34052 1727204441.20081: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204441.20083: Set connection var ansible_shell_executable to /bin/sh 34052 1727204441.20085: variable 'ansible_shell_executable' from source: unknown 34052 1727204441.20087: variable 'ansible_connection' from source: unknown 34052 1727204441.20089: variable 'ansible_module_compression' from source: unknown 34052 1727204441.20091: variable 'ansible_shell_type' from source: unknown 34052 1727204441.20094: variable 'ansible_shell_executable' from source: unknown 34052 1727204441.20096: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204441.20098: variable 'ansible_pipelining' from source: unknown 34052 1727204441.20100: variable 'ansible_timeout' from source: unknown 34052 1727204441.20102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204441.20196: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204441.20209: variable 'omit' from source: magic vars 34052 1727204441.20215: starting attempt loop 34052 1727204441.20218: running the handler 34052 1727204441.20227: variable 'ansible_facts' from source: unknown 34052 1727204441.20230: variable 'ansible_facts' from source: unknown 34052 1727204441.20275: _low_level_execute_command(): starting 34052 1727204441.20289: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204441.21054: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204441.21062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204441.21077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204441.21094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204441.21108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204441.21112: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204441.21158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204441.21162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204441.21170: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204441.21172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34052 1727204441.21177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204441.21255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204441.21267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204441.21284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204441.21297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204441.21387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204441.23280: stdout chunk (state=3): >>>/root <<< 34052 1727204441.23477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204441.23481: stdout chunk (state=3): >>><<< 34052 1727204441.23491: stderr chunk (state=3): >>><<< 34052 1727204441.23875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204441.23880: _low_level_execute_command(): starting 34052 1727204441.23884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936 `" && echo ansible-tmp-1727204441.2351272-35888-212297918105936="` echo /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936 `" ) && sleep 0' 34052 1727204441.25089: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204441.25493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204441.25595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204441.27698: stdout chunk (state=3): >>>ansible-tmp-1727204441.2351272-35888-212297918105936=/root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936 <<< 34052 1727204441.27904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204441.27908: stderr chunk (state=3): >>><<< 34052 1727204441.27911: stdout chunk (state=3): >>><<< 34052 1727204441.27985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204441.2351272-35888-212297918105936=/root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204441.27989: variable 'ansible_module_compression' from source: unknown 34052 1727204441.28097: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 34052 1727204441.28236: variable 'ansible_facts' from source: unknown 34052 1727204441.28596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/AnsiballZ_dnf.py 34052 1727204441.29004: Sending initial data 34052 1727204441.29007: Sent initial data (152 bytes) 34052 1727204441.29813: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204441.29818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204441.29957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204441.29964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204441.30067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204441.31840: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34052 1727204441.31939: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204441.31992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204441.32036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/AnsiballZ_dnf.py" <<< 34052 1727204441.32158: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmplcemhei_ /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/AnsiballZ_dnf.py <<< 34052 1727204441.32184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmplcemhei_" to remote "/root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/AnsiballZ_dnf.py" <<< 34052 1727204441.34276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204441.34498: stderr chunk (state=3): >>><<< 34052 1727204441.34502: stdout chunk (state=3): >>><<< 34052 1727204441.34548: done transferring module to remote 34052 1727204441.34610: _low_level_execute_command(): starting 34052 1727204441.34701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/ /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/AnsiballZ_dnf.py && sleep 0' 34052 1727204441.36292: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204441.36475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204441.36574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204441.36601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204441.36625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204441.36646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204441.36804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204441.38842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204441.38950: stderr chunk (state=3): >>><<< 34052 1727204441.39075: stdout chunk (state=3): >>><<< 34052 1727204441.39079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204441.39082: _low_level_execute_command(): starting 34052 1727204441.39085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/AnsiballZ_dnf.py && sleep 0' 34052 1727204441.40674: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204441.40886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204441.40907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204441.41015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204442.70896: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 34052 1727204442.75830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204442.75843: stdout chunk (state=3): >>><<< 34052 1727204442.75863: stderr chunk (state=3): >>><<< 34052 1727204442.76046: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
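For reference, the module result above (name: ["iputils"], state: present, dispatched through the generic package action and executed as an AnsiballZ-wrapped dnf module) is consistent with a task roughly like the following. This is a sketch reconstructed from the logged module arguments, not the verbatim contents of tests_ipv6.yml:81:

    - name: Ensure ping6 command is present
      package:
        name: iputils
        state: present
      # The log also records the conditional (ansible_distribution_major_version != '6')
      # being evaluated for this task; whether that 'when' sits on the task itself or is
      # inherited from an enclosing block or play is not visible in this excerpt.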
34052 1727204442.76056: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204442.76060: _low_level_execute_command(): starting 34052 1727204442.76062: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204441.2351272-35888-212297918105936/ > /dev/null 2>&1 && sleep 0' 34052 1727204442.76690: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.76723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204442.76746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204442.76762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204442.76851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204442.78831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204442.78871: stderr chunk (state=3): >>><<< 34052 1727204442.78874: stdout chunk (state=3): >>><<< 34052 1727204442.78888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204442.78895: handler run complete 34052 1727204442.78927: attempt loop complete, returning result 34052 1727204442.78930: _execute() done 34052 1727204442.78933: dumping result to json 34052 1727204442.78935: done dumping result, returning 34052 1727204442.78944: done running TaskExecutor() for managed-node1/TASK: Ensure ping6 command is present [127b8e07-fff9-66a4-e2a3-000000000064] 34052 1727204442.78947: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000064 34052 1727204442.79057: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000064 34052 1727204442.79060: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 34052 1727204442.79177: no more pending results, returning what we have 34052 1727204442.79181: results queue empty 34052 1727204442.79182: checking for any_errors_fatal 34052 1727204442.79188: done checking for any_errors_fatal 34052 1727204442.79189: checking for max_fail_percentage 34052 1727204442.79191: done checking for max_fail_percentage 34052 1727204442.79192: checking to see if all hosts have failed and the running result is not ok 34052 1727204442.79193: done checking to see if all hosts have failed 34052 1727204442.79194: getting the remaining hosts for this loop 34052 1727204442.79196: done getting the remaining hosts for this loop 34052 1727204442.79210: getting the next task for host managed-node1 34052 1727204442.79216: done getting next task for host managed-node1 34052 1727204442.79219: ^ task is: TASK: Test gateway can be pinged 34052 1727204442.79221: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204442.79224: getting variables 34052 1727204442.79226: in VariableManager get_vars() 34052 1727204442.79268: Calling all_inventory to load vars for managed-node1 34052 1727204442.79270: Calling groups_inventory to load vars for managed-node1 34052 1727204442.79272: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204442.79284: Calling all_plugins_play to load vars for managed-node1 34052 1727204442.79287: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204442.79289: Calling groups_plugins_play to load vars for managed-node1 34052 1727204442.80441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204442.82075: done with get_vars() 34052 1727204442.82118: done getting variables 34052 1727204442.82192: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Tuesday 24 September 2024 15:00:42 -0400 (0:00:01.682) 0:00:29.139 ***** 34052 1727204442.82225: entering _queue_task() for managed-node1/command 34052 1727204442.82635: worker is 1 (out of 1 available) 34052 1727204442.82650: exiting _queue_task() for managed-node1/command 34052 1727204442.82663: done queuing things up, now waiting for results queue to drain 34052 1727204442.82768: waiting for pending results... 34052 1727204442.83089: running TaskExecutor() for managed-node1/TASK: Test gateway can be pinged 34052 1727204442.83129: in run() - task 127b8e07-fff9-66a4-e2a3-000000000065 34052 1727204442.83154: variable 'ansible_search_path' from source: unknown 34052 1727204442.83210: calling self._execute() 34052 1727204442.83329: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204442.83344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204442.83368: variable 'omit' from source: magic vars 34052 1727204442.83850: variable 'ansible_distribution_major_version' from source: facts 34052 1727204442.83853: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204442.83856: variable 'omit' from source: magic vars 34052 1727204442.83858: variable 'omit' from source: magic vars 34052 1727204442.83917: variable 'omit' from source: magic vars 34052 1727204442.83928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204442.83988: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204442.84009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204442.84052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204442.84055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204442.84271: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204442.84274: variable 'ansible_host' from source: host vars for 
'managed-node1' 34052 1727204442.84276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204442.84295: Set connection var ansible_connection to ssh 34052 1727204442.84299: Set connection var ansible_timeout to 10 34052 1727204442.84301: Set connection var ansible_pipelining to False 34052 1727204442.84303: Set connection var ansible_shell_type to sh 34052 1727204442.84306: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204442.84308: Set connection var ansible_shell_executable to /bin/sh 34052 1727204442.84310: variable 'ansible_shell_executable' from source: unknown 34052 1727204442.84313: variable 'ansible_connection' from source: unknown 34052 1727204442.84315: variable 'ansible_module_compression' from source: unknown 34052 1727204442.84318: variable 'ansible_shell_type' from source: unknown 34052 1727204442.84320: variable 'ansible_shell_executable' from source: unknown 34052 1727204442.84322: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204442.84324: variable 'ansible_pipelining' from source: unknown 34052 1727204442.84326: variable 'ansible_timeout' from source: unknown 34052 1727204442.84331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204442.84514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204442.84525: variable 'omit' from source: magic vars 34052 1727204442.84534: starting attempt loop 34052 1727204442.84537: running the handler 34052 1727204442.84557: _low_level_execute_command(): starting 34052 1727204442.84574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204442.85319: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204442.85328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204442.85368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204442.85373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.85485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204442.85488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204442.85593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204442.87383: stdout chunk (state=3): >>>/root <<< 34052 1727204442.87528: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 34052 1727204442.87664: stderr chunk (state=3): >>><<< 34052 1727204442.87670: stdout chunk (state=3): >>><<< 34052 1727204442.87996: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204442.88000: _low_level_execute_command(): starting 34052 1727204442.88003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485 `" && echo ansible-tmp-1727204442.8789227-35960-97953192212485="` echo /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485 `" ) && sleep 0' 34052 1727204442.88779: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204442.88787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.88813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204442.88825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.88900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204442.88908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204442.88976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204442.91085: stdout chunk (state=3): >>>ansible-tmp-1727204442.8789227-35960-97953192212485=/root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485 <<< 34052 
1727204442.91218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204442.91275: stderr chunk (state=3): >>><<< 34052 1727204442.91278: stdout chunk (state=3): >>><<< 34052 1727204442.91294: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204442.8789227-35960-97953192212485=/root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204442.91337: variable 'ansible_module_compression' from source: unknown 34052 1727204442.91472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204442.91477: variable 'ansible_facts' from source: unknown 34052 1727204442.91646: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/AnsiballZ_command.py 34052 1727204442.91860: Sending initial data 34052 1727204442.91864: Sent initial data (155 bytes) 34052 1727204442.92947: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204442.92985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.92989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204442.92991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.93046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204442.93049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204442.93052: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 34052 1727204442.93115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204442.94834: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204442.94879: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204442.94949: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmphpoz4ntd /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/AnsiballZ_command.py <<< 34052 1727204442.94953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/AnsiballZ_command.py" <<< 34052 1727204442.95000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmphpoz4ntd" to remote "/root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/AnsiballZ_command.py" <<< 34052 1727204442.95712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204442.95809: stderr chunk (state=3): >>><<< 34052 1727204442.95813: stdout chunk (state=3): >>><<< 34052 1727204442.95846: done transferring module to remote 34052 1727204442.95854: _low_level_execute_command(): starting 34052 1727204442.95871: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/ /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/AnsiballZ_command.py && sleep 0' 34052 1727204442.96600: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204442.96604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found <<< 34052 1727204442.96606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204442.96609: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204442.96611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 
10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.96628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204442.96671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204442.96751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204442.98741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204442.98781: stderr chunk (state=3): >>><<< 34052 1727204442.98785: stdout chunk (state=3): >>><<< 34052 1727204442.98800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204442.98803: _low_level_execute_command(): starting 34052 1727204442.98809: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/AnsiballZ_command.py && sleep 0' 34052 1727204442.99307: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204442.99311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.99314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204442.99318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204442.99320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204442.99375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204442.99382: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204442.99384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204442.99443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204443.16962: stdout chunk (state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-24 15:00:43.163871", "end": "2024-09-24 15:00:43.168495", "delta": "0:00:00.004624", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204443.18816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204443.18821: stdout chunk (state=3): >>><<< 34052 1727204443.18823: stderr chunk (state=3): >>><<< 34052 1727204443.18851: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-24 15:00:43.163871", "end": "2024-09-24 15:00:43.168495", "delta": "0:00:00.004624", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
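The _raw_params in the module result above (ping6 -c1 2001:db8::1, run through ansible.legacy.command with _uses_shell: false) point to a plain command task. A minimal sketch of what tests_ipv6.yml:86 likely contains (a reconstruction from the log, not the verbatim playbook text):

    - name: Test gateway can be pinged
      command: ping6 -c1 2001:db8::1
      # Assumption: the task result reported further down shows changed: false even though
      # the command module itself returned changed: true, and a conditional (False) is
      # evaluated after the handler runs, which is consistent with changed_when: false.
      changed_when: false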
34052 1727204443.18981: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204443.18985: _low_level_execute_command(): starting 34052 1727204443.18987: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204442.8789227-35960-97953192212485/ > /dev/null 2>&1 && sleep 0' 34052 1727204443.19778: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204443.20013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.20117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.20176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204443.20180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204443.20259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204443.22450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204443.22454: stderr chunk (state=3): >>><<< 34052 1727204443.22457: stdout chunk (state=3): >>><<< 34052 1727204443.22460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204443.22462: handler run complete 34052 1727204443.22489: Evaluated conditional (False): False 34052 1727204443.22494: attempt loop complete, returning result 34052 1727204443.22497: _execute() done 34052 1727204443.22499: dumping result to json 34052 1727204443.22501: done dumping result, returning 34052 1727204443.22503: done running TaskExecutor() for managed-node1/TASK: Test gateway can be pinged [127b8e07-fff9-66a4-e2a3-000000000065] 34052 1727204443.22508: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000065 34052 1727204443.22652: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000065 34052 1727204443.22655: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.004624", "end": "2024-09-24 15:00:43.168495", "rc": 0, "start": "2024-09-24 15:00:43.163871" } STDOUT: PING 2001:db8::1 (2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms 34052 1727204443.22834: no more pending results, returning what we have 34052 1727204443.22838: results queue empty 34052 1727204443.22839: checking for any_errors_fatal 34052 1727204443.22848: done checking for any_errors_fatal 34052 1727204443.22848: checking for max_fail_percentage 34052 1727204443.22850: done checking for max_fail_percentage 34052 1727204443.22853: checking to see if all hosts have failed and the running result is not ok 34052 1727204443.22854: done checking to see if all hosts have failed 34052 1727204443.22855: getting the remaining hosts for this loop 34052 1727204443.22857: done getting the remaining hosts for this loop 34052 1727204443.22864: getting the next task for host managed-node1 34052 1727204443.22874: done getting next task for host managed-node1 34052 1727204443.22877: ^ task is: TASK: TEARDOWN: remove profiles. 34052 1727204443.22879: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204443.22883: getting variables 34052 1727204443.22884: in VariableManager get_vars() 34052 1727204443.22931: Calling all_inventory to load vars for managed-node1 34052 1727204443.22934: Calling groups_inventory to load vars for managed-node1 34052 1727204443.22937: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204443.22950: Calling all_plugins_play to load vars for managed-node1 34052 1727204443.22953: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204443.22957: Calling groups_plugins_play to load vars for managed-node1 34052 1727204443.25035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204443.26409: done with get_vars() 34052 1727204443.26445: done getting variables 34052 1727204443.26508: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Tuesday 24 September 2024 15:00:43 -0400 (0:00:00.443) 0:00:29.582 ***** 34052 1727204443.26539: entering _queue_task() for managed-node1/debug 34052 1727204443.26947: worker is 1 (out of 1 available) 34052 1727204443.27171: exiting _queue_task() for managed-node1/debug 34052 1727204443.27185: done queuing things up, now waiting for results queue to drain 34052 1727204443.27187: waiting for pending results... 34052 1727204443.27592: running TaskExecutor() for managed-node1/TASK: TEARDOWN: remove profiles. 
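The teardown banner task queued here (task path tests_ipv6.yml:92) uses the debug action plugin loaded just above, and its result further down prints a MSG consisting of a line of '#' characters. A plausible reconstruction of the task (a sketch, not the verbatim playbook text):

    - name: "TEARDOWN: remove profiles."
      debug:
        msg: "##################################################"
        # banner width approximated from the MSG shown in the task result below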
34052 1727204443.27599: in run() - task 127b8e07-fff9-66a4-e2a3-000000000066 34052 1727204443.27603: variable 'ansible_search_path' from source: unknown 34052 1727204443.27606: calling self._execute() 34052 1727204443.27672: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.27693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.27711: variable 'omit' from source: magic vars 34052 1727204443.28194: variable 'ansible_distribution_major_version' from source: facts 34052 1727204443.28217: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204443.28236: variable 'omit' from source: magic vars 34052 1727204443.28274: variable 'omit' from source: magic vars 34052 1727204443.28325: variable 'omit' from source: magic vars 34052 1727204443.28387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204443.28445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204443.28469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204443.28556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204443.28558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204443.28561: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204443.28563: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.28566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.28683: Set connection var ansible_connection to ssh 34052 1727204443.28698: Set connection var ansible_timeout to 10 34052 1727204443.28711: Set connection var ansible_pipelining to False 34052 1727204443.28718: Set connection var ansible_shell_type to sh 34052 1727204443.28732: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204443.28746: Set connection var ansible_shell_executable to /bin/sh 34052 1727204443.28782: variable 'ansible_shell_executable' from source: unknown 34052 1727204443.28791: variable 'ansible_connection' from source: unknown 34052 1727204443.28798: variable 'ansible_module_compression' from source: unknown 34052 1727204443.28806: variable 'ansible_shell_type' from source: unknown 34052 1727204443.28814: variable 'ansible_shell_executable' from source: unknown 34052 1727204443.28820: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.28828: variable 'ansible_pipelining' from source: unknown 34052 1727204443.28835: variable 'ansible_timeout' from source: unknown 34052 1727204443.28843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.29038: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204443.29042: variable 'omit' from source: magic vars 34052 1727204443.29045: starting attempt loop 34052 1727204443.29047: running the handler 34052 1727204443.29080: handler run complete 34052 1727204443.29098: attempt loop complete, 
returning result 34052 1727204443.29101: _execute() done 34052 1727204443.29104: dumping result to json 34052 1727204443.29114: done dumping result, returning 34052 1727204443.29117: done running TaskExecutor() for managed-node1/TASK: TEARDOWN: remove profiles. [127b8e07-fff9-66a4-e2a3-000000000066] 34052 1727204443.29119: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000066 ok: [managed-node1] => {} MSG: ################################################## 34052 1727204443.29276: no more pending results, returning what we have 34052 1727204443.29279: results queue empty 34052 1727204443.29280: checking for any_errors_fatal 34052 1727204443.29292: done checking for any_errors_fatal 34052 1727204443.29293: checking for max_fail_percentage 34052 1727204443.29294: done checking for max_fail_percentage 34052 1727204443.29295: checking to see if all hosts have failed and the running result is not ok 34052 1727204443.29296: done checking to see if all hosts have failed 34052 1727204443.29297: getting the remaining hosts for this loop 34052 1727204443.29299: done getting the remaining hosts for this loop 34052 1727204443.29303: getting the next task for host managed-node1 34052 1727204443.29312: done getting next task for host managed-node1 34052 1727204443.29318: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34052 1727204443.29321: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204443.29345: getting variables 34052 1727204443.29347: in VariableManager get_vars() 34052 1727204443.29408: Calling all_inventory to load vars for managed-node1 34052 1727204443.29412: Calling groups_inventory to load vars for managed-node1 34052 1727204443.29414: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204443.29423: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000066 34052 1727204443.29427: WORKER PROCESS EXITING 34052 1727204443.29437: Calling all_plugins_play to load vars for managed-node1 34052 1727204443.29439: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204443.29442: Calling groups_plugins_play to load vars for managed-node1 34052 1727204443.30555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204443.32506: done with get_vars() 34052 1727204443.32549: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:00:43 -0400 (0:00:00.061) 0:00:29.643 ***** 34052 1727204443.32661: entering _queue_task() for managed-node1/include_tasks 34052 1727204443.33147: worker is 1 (out of 1 available) 34052 1727204443.33162: exiting _queue_task() for managed-node1/include_tasks 34052 1727204443.33180: done queuing things up, now waiting for results queue to drain 34052 1727204443.33186: waiting for pending results... 
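
The task announced above ("fedora.linux_system_roles.network : Ensure ansible_facts used by role", roles/network/tasks/main.yml:4) is queued for the include_tasks handler, and the entries that follow show it loading set_facts.yml from the role. A minimal sketch of what such an entry point looks like, assuming no tags or variables beyond what the log shows:

    # Sketch of the include at main.yml:4; the real task may carry additional tags/vars.
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml
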
34052 1727204443.33638: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34052 1727204443.33963: in run() - task 127b8e07-fff9-66a4-e2a3-00000000006e 34052 1727204443.34116: variable 'ansible_search_path' from source: unknown 34052 1727204443.34122: variable 'ansible_search_path' from source: unknown 34052 1727204443.34130: calling self._execute() 34052 1727204443.34317: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.34322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.34343: variable 'omit' from source: magic vars 34052 1727204443.34852: variable 'ansible_distribution_major_version' from source: facts 34052 1727204443.34857: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204443.34861: _execute() done 34052 1727204443.34864: dumping result to json 34052 1727204443.34869: done dumping result, returning 34052 1727204443.34872: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-66a4-e2a3-00000000006e] 34052 1727204443.34874: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000006e 34052 1727204443.35033: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000006e 34052 1727204443.35036: WORKER PROCESS EXITING 34052 1727204443.35146: no more pending results, returning what we have 34052 1727204443.35154: in VariableManager get_vars() 34052 1727204443.35203: Calling all_inventory to load vars for managed-node1 34052 1727204443.35207: Calling groups_inventory to load vars for managed-node1 34052 1727204443.35209: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204443.35224: Calling all_plugins_play to load vars for managed-node1 34052 1727204443.35226: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204443.35230: Calling groups_plugins_play to load vars for managed-node1 34052 1727204443.36914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204443.38969: done with get_vars() 34052 1727204443.38998: variable 'ansible_search_path' from source: unknown 34052 1727204443.38999: variable 'ansible_search_path' from source: unknown 34052 1727204443.39057: we have included files to process 34052 1727204443.39059: generating all_blocks data 34052 1727204443.39061: done generating all_blocks data 34052 1727204443.39074: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34052 1727204443.39078: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34052 1727204443.39082: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34052 1727204443.39821: done processing included file 34052 1727204443.39823: iterating over new_blocks loaded from include file 34052 1727204443.39824: in VariableManager get_vars() 34052 1727204443.39860: done with get_vars() 34052 1727204443.39863: filtering new block on tags 34052 1727204443.39886: done filtering new block on tags 34052 1727204443.39892: in VariableManager get_vars() 34052 1727204443.39927: done with get_vars() 34052 1727204443.39929: filtering new block on tags 34052 1727204443.39945: done filtering new block on tags 34052 1727204443.39947: in 
VariableManager get_vars() 34052 1727204443.39963: done with get_vars() 34052 1727204443.39969: filtering new block on tags 34052 1727204443.39995: done filtering new block on tags 34052 1727204443.39998: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 34052 1727204443.40003: extending task lists for all hosts with included blocks 34052 1727204443.40918: done extending task lists 34052 1727204443.40920: done processing included files 34052 1727204443.40921: results queue empty 34052 1727204443.40922: checking for any_errors_fatal 34052 1727204443.40928: done checking for any_errors_fatal 34052 1727204443.40929: checking for max_fail_percentage 34052 1727204443.40930: done checking for max_fail_percentage 34052 1727204443.40931: checking to see if all hosts have failed and the running result is not ok 34052 1727204443.40932: done checking to see if all hosts have failed 34052 1727204443.40933: getting the remaining hosts for this loop 34052 1727204443.40936: done getting the remaining hosts for this loop 34052 1727204443.40939: getting the next task for host managed-node1 34052 1727204443.40945: done getting next task for host managed-node1 34052 1727204443.40948: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34052 1727204443.40952: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204443.40975: getting variables 34052 1727204443.40976: in VariableManager get_vars() 34052 1727204443.40997: Calling all_inventory to load vars for managed-node1 34052 1727204443.40999: Calling groups_inventory to load vars for managed-node1 34052 1727204443.41000: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204443.41005: Calling all_plugins_play to load vars for managed-node1 34052 1727204443.41007: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204443.41009: Calling groups_plugins_play to load vars for managed-node1 34052 1727204443.49001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204443.50212: done with get_vars() 34052 1727204443.50242: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:00:43 -0400 (0:00:00.176) 0:00:29.820 ***** 34052 1727204443.50311: entering _queue_task() for managed-node1/setup 34052 1727204443.50621: worker is 1 (out of 1 available) 34052 1727204443.50635: exiting _queue_task() for managed-node1/setup 34052 1727204443.50649: done queuing things up, now waiting for results queue to drain 34052 1727204443.50652: waiting for pending results... 34052 1727204443.51193: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34052 1727204443.51528: in run() - task 127b8e07-fff9-66a4-e2a3-000000000513 34052 1727204443.51555: variable 'ansible_search_path' from source: unknown 34052 1727204443.51587: variable 'ansible_search_path' from source: unknown 34052 1727204443.51756: calling self._execute() 34052 1727204443.51864: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.51877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.51914: variable 'omit' from source: magic vars 34052 1727204443.52855: variable 'ansible_distribution_major_version' from source: facts 34052 1727204443.52869: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204443.53439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204443.56467: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204443.56526: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204443.56563: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204443.56594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204443.56615: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204443.56687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204443.56709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 34052 1727204443.56727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204443.56762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204443.56776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204443.56820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204443.56841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204443.56861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204443.56891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204443.56903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204443.57133: variable '__network_required_facts' from source: role '' defaults 34052 1727204443.57136: variable 'ansible_facts' from source: unknown 34052 1727204443.58335: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 34052 1727204443.58339: when evaluation is False, skipping this task 34052 1727204443.58342: _execute() done 34052 1727204443.58344: dumping result to json 34052 1727204443.58346: done dumping result, returning 34052 1727204443.58348: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-66a4-e2a3-000000000513] 34052 1727204443.58350: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000513 34052 1727204443.58423: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000513 34052 1727204443.58425: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204443.58483: no more pending results, returning what we have 34052 1727204443.58488: results queue empty 34052 1727204443.58488: checking for any_errors_fatal 34052 1727204443.58490: done checking for any_errors_fatal 34052 1727204443.58490: checking for max_fail_percentage 34052 1727204443.58492: done checking for max_fail_percentage 34052 1727204443.58493: checking to see if all hosts have failed and the running result is not ok 34052 1727204443.58494: done checking to see if all hosts have failed 34052 1727204443.58494: getting the remaining hosts for this loop 34052 1727204443.58497: done getting the remaining hosts for 
this loop 34052 1727204443.58501: getting the next task for host managed-node1 34052 1727204443.58509: done getting next task for host managed-node1 34052 1727204443.58514: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 34052 1727204443.58518: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204443.58536: getting variables 34052 1727204443.58537: in VariableManager get_vars() 34052 1727204443.58674: Calling all_inventory to load vars for managed-node1 34052 1727204443.58678: Calling groups_inventory to load vars for managed-node1 34052 1727204443.58680: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204443.58690: Calling all_plugins_play to load vars for managed-node1 34052 1727204443.58697: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204443.58701: Calling groups_plugins_play to load vars for managed-node1 34052 1727204443.60480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204443.62847: done with get_vars() 34052 1727204443.62897: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:00:43 -0400 (0:00:00.127) 0:00:29.947 ***** 34052 1727204443.63026: entering _queue_task() for managed-node1/stat 34052 1727204443.63453: worker is 1 (out of 1 available) 34052 1727204443.63478: exiting _queue_task() for managed-node1/stat 34052 1727204443.63493: done queuing things up, now waiting for results queue to drain 34052 1727204443.63495: waiting for pending results... 
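
The "Ensure ansible_facts used by role are present" task above (set_facts.yml:3) was skipped: the logged condition __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False, meaning every fact the role needs is already cached, and its result was censored because the task sets no_log: true. The handler queued for it was setup, so it is a conditional fact-gathering step roughly along these lines; only the name, condition, module and no_log are visible in the log, and the setup arguments below are assumptions:

    # Sketch only - the gather_subset/filter arguments are assumptions, not taken from this log.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min                        # assumption
        filter: "{{ __network_required_facts }}"  # assumption
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true
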
34052 1727204443.63888: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 34052 1727204443.63963: in run() - task 127b8e07-fff9-66a4-e2a3-000000000515 34052 1727204443.63983: variable 'ansible_search_path' from source: unknown 34052 1727204443.63987: variable 'ansible_search_path' from source: unknown 34052 1727204443.64025: calling self._execute() 34052 1727204443.64201: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.64206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.64210: variable 'omit' from source: magic vars 34052 1727204443.64556: variable 'ansible_distribution_major_version' from source: facts 34052 1727204443.64569: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204443.64741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204443.65018: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204443.65068: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204443.65532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204443.65557: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204443.65745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204443.65749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204443.65754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204443.65757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204443.65836: variable '__network_is_ostree' from source: set_fact 34052 1727204443.65845: Evaluated conditional (not __network_is_ostree is defined): False 34052 1727204443.65848: when evaluation is False, skipping this task 34052 1727204443.65851: _execute() done 34052 1727204443.65854: dumping result to json 34052 1727204443.65857: done dumping result, returning 34052 1727204443.65971: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-66a4-e2a3-000000000515] 34052 1727204443.65975: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000515 34052 1727204443.66047: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000515 34052 1727204443.66050: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34052 1727204443.66106: no more pending results, returning what we have 34052 1727204443.66109: results queue empty 34052 1727204443.66110: checking for any_errors_fatal 34052 1727204443.66115: done checking for any_errors_fatal 34052 1727204443.66115: checking for 
max_fail_percentage 34052 1727204443.66117: done checking for max_fail_percentage 34052 1727204443.66118: checking to see if all hosts have failed and the running result is not ok 34052 1727204443.66119: done checking to see if all hosts have failed 34052 1727204443.66120: getting the remaining hosts for this loop 34052 1727204443.66122: done getting the remaining hosts for this loop 34052 1727204443.66126: getting the next task for host managed-node1 34052 1727204443.66133: done getting next task for host managed-node1 34052 1727204443.66137: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34052 1727204443.66141: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204443.66160: getting variables 34052 1727204443.66161: in VariableManager get_vars() 34052 1727204443.66202: Calling all_inventory to load vars for managed-node1 34052 1727204443.66204: Calling groups_inventory to load vars for managed-node1 34052 1727204443.66206: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204443.66216: Calling all_plugins_play to load vars for managed-node1 34052 1727204443.66218: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204443.66221: Calling groups_plugins_play to load vars for managed-node1 34052 1727204443.68039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204443.70256: done with get_vars() 34052 1727204443.70298: done getting variables 34052 1727204443.70385: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:00:43 -0400 (0:00:00.074) 0:00:30.021 ***** 34052 1727204443.70429: entering _queue_task() for managed-node1/set_fact 34052 1727204443.70861: worker is 1 (out of 1 available) 34052 1727204443.70880: exiting _queue_task() for managed-node1/set_fact 34052 1727204443.70901: done queuing things up, now waiting for results queue to drain 34052 1727204443.70903: waiting for pending results... 
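
Both the stat-based "Check if system is ostree" task just skipped (set_facts.yml:12) and the "Set flag to indicate system is ostree" set_fact task being queued next (set_facts.yml:17) guard on the same condition, not __network_is_ostree is defined, which is False here because the flag was already set earlier in the run. The log never shows which path the stat task inspects or what it registers, so the sketch below fills those in with hypothetical values purely for illustration:

    # Sketch only - path and register name are hypothetical; only the task name,
    # module (stat) and when-condition appear in this log.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # hypothetical marker file, not shown in the log
      register: __ostree_booted_stat    # hypothetical variable name
      when: not __network_is_ostree is defined
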
34052 1727204443.71468: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34052 1727204443.71474: in run() - task 127b8e07-fff9-66a4-e2a3-000000000516 34052 1727204443.71501: variable 'ansible_search_path' from source: unknown 34052 1727204443.71509: variable 'ansible_search_path' from source: unknown 34052 1727204443.71570: calling self._execute() 34052 1727204443.71704: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.71718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.71736: variable 'omit' from source: magic vars 34052 1727204443.72258: variable 'ansible_distribution_major_version' from source: facts 34052 1727204443.72320: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204443.72511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204443.72921: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204443.72981: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204443.73088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204443.73172: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204443.73263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204443.73300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204443.73346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204443.73383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204443.73550: variable '__network_is_ostree' from source: set_fact 34052 1727204443.73556: Evaluated conditional (not __network_is_ostree is defined): False 34052 1727204443.73558: when evaluation is False, skipping this task 34052 1727204443.73561: _execute() done 34052 1727204443.73563: dumping result to json 34052 1727204443.73569: done dumping result, returning 34052 1727204443.73575: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-66a4-e2a3-000000000516] 34052 1727204443.73578: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000516 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34052 1727204443.73739: no more pending results, returning what we have 34052 1727204443.73743: results queue empty 34052 1727204443.73744: checking for any_errors_fatal 34052 1727204443.73750: done checking for any_errors_fatal 34052 1727204443.73751: checking for max_fail_percentage 34052 1727204443.73753: done checking for max_fail_percentage 34052 1727204443.73754: checking to see 
if all hosts have failed and the running result is not ok 34052 1727204443.73755: done checking to see if all hosts have failed 34052 1727204443.73756: getting the remaining hosts for this loop 34052 1727204443.73758: done getting the remaining hosts for this loop 34052 1727204443.73762: getting the next task for host managed-node1 34052 1727204443.73977: done getting next task for host managed-node1 34052 1727204443.73982: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 34052 1727204443.73986: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204443.74006: getting variables 34052 1727204443.74009: in VariableManager get_vars() 34052 1727204443.74057: Calling all_inventory to load vars for managed-node1 34052 1727204443.74060: Calling groups_inventory to load vars for managed-node1 34052 1727204443.74062: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204443.74079: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000516 34052 1727204443.74083: WORKER PROCESS EXITING 34052 1727204443.74095: Calling all_plugins_play to load vars for managed-node1 34052 1727204443.74099: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204443.74103: Calling groups_plugins_play to load vars for managed-node1 34052 1727204443.76347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204443.78019: done with get_vars() 34052 1727204443.78051: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:00:43 -0400 (0:00:00.077) 0:00:30.098 ***** 34052 1727204443.78140: entering _queue_task() for managed-node1/service_facts 34052 1727204443.78440: worker is 1 (out of 1 available) 34052 1727204443.78455: exiting _queue_task() for managed-node1/service_facts 34052 1727204443.78469: done queuing things up, now waiting for results queue to drain 34052 1727204443.78471: waiting for pending results... 
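
The "Check which services are running" task queued above (set_facts.yml:21) uses the service_facts handler; the SSH session, temporary directory, AnsiballZ_service_facts.py transfer and the large {"ansible_facts": {"services": ...}} payload in the stdout chunks that follow are its execution. The module takes no arguments, so the task is essentially just:

    # Sketch of a service_facts invocation; its return value populates
    # ansible_facts.services, the JSON visible in the stdout chunks below.
    - name: Check which services are running
      ansible.builtin.service_facts:
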
34052 1727204443.78696: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 34052 1727204443.78805: in run() - task 127b8e07-fff9-66a4-e2a3-000000000518 34052 1727204443.78822: variable 'ansible_search_path' from source: unknown 34052 1727204443.78832: variable 'ansible_search_path' from source: unknown 34052 1727204443.79073: calling self._execute() 34052 1727204443.79077: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.79081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.79084: variable 'omit' from source: magic vars 34052 1727204443.79338: variable 'ansible_distribution_major_version' from source: facts 34052 1727204443.79348: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204443.79354: variable 'omit' from source: magic vars 34052 1727204443.79421: variable 'omit' from source: magic vars 34052 1727204443.79456: variable 'omit' from source: magic vars 34052 1727204443.79492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204443.79529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204443.79547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204443.79563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204443.79576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204443.79602: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204443.79605: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.79608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.79689: Set connection var ansible_connection to ssh 34052 1727204443.79696: Set connection var ansible_timeout to 10 34052 1727204443.79703: Set connection var ansible_pipelining to False 34052 1727204443.79706: Set connection var ansible_shell_type to sh 34052 1727204443.79713: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204443.79720: Set connection var ansible_shell_executable to /bin/sh 34052 1727204443.79745: variable 'ansible_shell_executable' from source: unknown 34052 1727204443.79748: variable 'ansible_connection' from source: unknown 34052 1727204443.79753: variable 'ansible_module_compression' from source: unknown 34052 1727204443.79756: variable 'ansible_shell_type' from source: unknown 34052 1727204443.79758: variable 'ansible_shell_executable' from source: unknown 34052 1727204443.79761: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204443.79763: variable 'ansible_pipelining' from source: unknown 34052 1727204443.79767: variable 'ansible_timeout' from source: unknown 34052 1727204443.79769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204443.80172: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204443.80178: variable 'omit' from source: magic vars 34052 
1727204443.80181: starting attempt loop 34052 1727204443.80183: running the handler 34052 1727204443.80185: _low_level_execute_command(): starting 34052 1727204443.80187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204443.80695: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.80714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.80763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204443.80780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204443.80853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204443.82653: stdout chunk (state=3): >>>/root <<< 34052 1727204443.82758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204443.82826: stderr chunk (state=3): >>><<< 34052 1727204443.82834: stdout chunk (state=3): >>><<< 34052 1727204443.82857: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204443.82877: _low_level_execute_command(): starting 34052 1727204443.82884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053 `" && echo ansible-tmp-1727204443.828621-36053-88277630001053="` echo 
/root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053 `" ) && sleep 0' 34052 1727204443.83398: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.83402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.83405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.83416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.83470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204443.83475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204443.83481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204443.83538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204443.85645: stdout chunk (state=3): >>>ansible-tmp-1727204443.828621-36053-88277630001053=/root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053 <<< 34052 1727204443.85755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204443.85821: stderr chunk (state=3): >>><<< 34052 1727204443.85825: stdout chunk (state=3): >>><<< 34052 1727204443.85842: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204443.828621-36053-88277630001053=/root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204443.85888: variable 'ansible_module_compression' from source: unknown 34052 1727204443.85936: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 34052 1727204443.85971: variable 'ansible_facts' from source: unknown 34052 1727204443.86034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/AnsiballZ_service_facts.py 34052 1727204443.86154: Sending initial data 34052 1727204443.86158: Sent initial data (160 bytes) 34052 1727204443.86637: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.86645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204443.86676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.86679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.86682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.86770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204443.86813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204443.88505: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204443.88553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204443.88604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp5cipov5r /root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/AnsiballZ_service_facts.py <<< 34052 1727204443.88608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/AnsiballZ_service_facts.py" <<< 34052 1727204443.88656: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp5cipov5r" to remote "/root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/AnsiballZ_service_facts.py" <<< 34052 1727204443.88659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/AnsiballZ_service_facts.py" <<< 34052 1727204443.89236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204443.89321: stderr chunk (state=3): >>><<< 34052 1727204443.89325: stdout chunk (state=3): >>><<< 34052 1727204443.89346: done transferring module to remote 34052 1727204443.89357: _low_level_execute_command(): starting 34052 1727204443.89362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/ /root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/AnsiballZ_service_facts.py && sleep 0' 34052 1727204443.89842: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.89846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.89883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204443.89886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.89942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204443.89949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204443.89951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204443.89999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204443.91978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204443.92044: stderr chunk (state=3): >>><<< 34052 1727204443.92048: stdout chunk (state=3): >>><<< 34052 1727204443.92063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204443.92069: _low_level_execute_command(): starting 34052 1727204443.92072: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/AnsiballZ_service_facts.py && sleep 0' 34052 1727204443.92572: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204443.92581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.92584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204443.92587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204443.92643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204443.92646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204443.92649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204443.92714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204446.26147: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 34052 1727204446.26170: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": 
"systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "sour<<< 34052 1727204446.26185: stdout chunk (state=3): >>>ce": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", 
"source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "syst<<< 34052 1727204446.26191: stdout chunk (state=3): >>>emd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": 
"fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymout<<< 34052 1727204446.26194: stdout chunk (state=3): >>>h-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": 
"systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 34052 1727204446.27897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204446.27964: stderr chunk (state=3): >>><<< 34052 1727204446.27970: stdout chunk (state=3): >>><<< 34052 1727204446.28005: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", 
"status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": 
"passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204446.28551: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204446.28559: _low_level_execute_command(): starting 34052 1727204446.28565: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204443.828621-36053-88277630001053/ > /dev/null 2>&1 && sleep 0' 34052 1727204446.29079: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204446.29083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.29086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204446.29088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204446.29090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.29147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204446.29156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204446.29158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204446.29206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204446.31231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204446.31296: stderr chunk (state=3): >>><<< 34052 1727204446.31300: stdout chunk (state=3): >>><<< 34052 1727204446.31314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204446.31321: handler run complete 34052 1727204446.31469: variable 'ansible_facts' from source: unknown 34052 1727204446.31605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204446.31949: variable 'ansible_facts' from source: unknown 34052 1727204446.32053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204446.32204: attempt loop complete, returning result 34052 1727204446.32210: _execute() done 34052 1727204446.32213: dumping result to json 34052 1727204446.32258: done dumping result, returning 34052 1727204446.32268: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-66a4-e2a3-000000000518] 34052 1727204446.32271: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000518 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204446.33015: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000518 34052 1727204446.33018: WORKER PROCESS EXITING 34052 1727204446.33030: no more pending results, returning what we have 34052 1727204446.33032: results queue empty 34052 1727204446.33032: checking for any_errors_fatal 34052 1727204446.33036: done checking for any_errors_fatal 34052 1727204446.33037: checking for max_fail_percentage 34052 1727204446.33038: done checking for max_fail_percentage 34052 1727204446.33038: checking to see if all hosts have failed and the running result is not ok 34052 1727204446.33039: done checking to see if all hosts have failed 34052 1727204446.33040: getting the remaining hosts for this loop 34052 1727204446.33040: done getting the remaining hosts for this loop 34052 1727204446.33043: getting the next task for host managed-node1 34052 1727204446.33047: done getting next task for host managed-node1 34052 1727204446.33050: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 34052 1727204446.33053: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204446.33060: getting variables 34052 1727204446.33061: in VariableManager get_vars() 34052 1727204446.33089: Calling all_inventory to load vars for managed-node1 34052 1727204446.33091: Calling groups_inventory to load vars for managed-node1 34052 1727204446.33092: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204446.33105: Calling all_plugins_play to load vars for managed-node1 34052 1727204446.33107: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204446.33109: Calling groups_plugins_play to load vars for managed-node1 34052 1727204446.34137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204446.35344: done with get_vars() 34052 1727204446.35372: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:00:46 -0400 (0:00:02.573) 0:00:32.671 ***** 34052 1727204446.35460: entering _queue_task() for managed-node1/package_facts 34052 1727204446.35757: worker is 1 (out of 1 available) 34052 1727204446.35774: exiting _queue_task() for managed-node1/package_facts 34052 1727204446.35787: done queuing things up, now waiting for results queue to drain 34052 1727204446.35789: waiting for pending results... 34052 1727204446.35982: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 34052 1727204446.36105: in run() - task 127b8e07-fff9-66a4-e2a3-000000000519 34052 1727204446.36120: variable 'ansible_search_path' from source: unknown 34052 1727204446.36124: variable 'ansible_search_path' from source: unknown 34052 1727204446.36158: calling self._execute() 34052 1727204446.36242: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204446.36246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204446.36258: variable 'omit' from source: magic vars 34052 1727204446.36557: variable 'ansible_distribution_major_version' from source: facts 34052 1727204446.36569: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204446.36575: variable 'omit' from source: magic vars 34052 1727204446.36630: variable 'omit' from source: magic vars 34052 1727204446.36656: variable 'omit' from source: magic vars 34052 1727204446.36697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204446.36729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204446.36745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204446.36760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204446.36773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204446.36805: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204446.36809: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204446.36812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204446.36882: Set connection var ansible_connection to ssh 34052 
1727204446.36889: Set connection var ansible_timeout to 10 34052 1727204446.36895: Set connection var ansible_pipelining to False 34052 1727204446.36898: Set connection var ansible_shell_type to sh 34052 1727204446.36910: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204446.36913: Set connection var ansible_shell_executable to /bin/sh 34052 1727204446.36935: variable 'ansible_shell_executable' from source: unknown 34052 1727204446.36938: variable 'ansible_connection' from source: unknown 34052 1727204446.36941: variable 'ansible_module_compression' from source: unknown 34052 1727204446.36943: variable 'ansible_shell_type' from source: unknown 34052 1727204446.36946: variable 'ansible_shell_executable' from source: unknown 34052 1727204446.36948: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204446.36952: variable 'ansible_pipelining' from source: unknown 34052 1727204446.36955: variable 'ansible_timeout' from source: unknown 34052 1727204446.36959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204446.37119: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204446.37132: variable 'omit' from source: magic vars 34052 1727204446.37137: starting attempt loop 34052 1727204446.37139: running the handler 34052 1727204446.37151: _low_level_execute_command(): starting 34052 1727204446.37158: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204446.37727: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204446.37733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204446.37736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.37792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204446.37800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204446.37804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204446.37857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204446.39659: stdout chunk (state=3): >>>/root <<< 34052 1727204446.39768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204446.39827: stderr chunk (state=3): >>><<< 34052 1727204446.39831: stdout chunk (state=3): >>><<< 34052 1727204446.39856: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204446.39873: _low_level_execute_command(): starting 34052 1727204446.39876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808 `" && echo ansible-tmp-1727204446.398569-36311-106200502884808="` echo /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808 `" ) && sleep 0' 34052 1727204446.40372: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204446.40395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.40398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.40457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204446.40460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204446.40463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204446.40520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204446.42635: stdout chunk (state=3): >>>ansible-tmp-1727204446.398569-36311-106200502884808=/root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808 <<< 34052 1727204446.42880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204446.42884: stdout chunk (state=3): >>><<< 34052 1727204446.42887: stderr chunk (state=3): >>><<< 34052 1727204446.42890: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204446.398569-36311-106200502884808=/root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204446.42964: variable 'ansible_module_compression' from source: unknown 34052 1727204446.42989: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 34052 1727204446.43043: variable 'ansible_facts' from source: unknown 34052 1727204446.43173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/AnsiballZ_package_facts.py 34052 1727204446.43303: Sending initial data 34052 1727204446.43307: Sent initial data (161 bytes) 34052 1727204446.43804: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204446.43808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.43811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204446.43813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.43873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204446.43877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204446.43942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204446.45676: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34052 1727204446.45706: stderr chunk (state=3): 
>>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204446.45757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204446.45811: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpd19a327q /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/AnsiballZ_package_facts.py <<< 34052 1727204446.45814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/AnsiballZ_package_facts.py" <<< 34052 1727204446.45859: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpd19a327q" to remote "/root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/AnsiballZ_package_facts.py" <<< 34052 1727204446.47746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204446.47751: stdout chunk (state=3): >>><<< 34052 1727204446.47753: stderr chunk (state=3): >>><<< 34052 1727204446.47756: done transferring module to remote 34052 1727204446.47758: _low_level_execute_command(): starting 34052 1727204446.47760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/ /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/AnsiballZ_package_facts.py && sleep 0' 34052 1727204446.48396: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204446.48426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204446.48442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204446.48535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204446.48575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 
1727204446.48594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204446.48617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204446.48724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204446.50806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204446.50827: stderr chunk (state=3): >>><<< 34052 1727204446.50840: stdout chunk (state=3): >>><<< 34052 1727204446.50861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204446.50877: _low_level_execute_command(): starting 34052 1727204446.50892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/AnsiballZ_package_facts.py && sleep 0' 34052 1727204446.51700: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204446.51752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204446.51801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204447.15945: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": 
[{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", 
"version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 34052 1727204447.16107: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": 
[{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": 
[{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": 
"41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", 
"release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 34052 1727204447.16153: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", 
"release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 34052 1727204447.18023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204447.18132: stderr chunk (state=3): >>>Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204447.18152: stderr chunk (state=3): >>><<< 34052 1727204447.18170: stdout chunk (state=3): >>><<< 34052 1727204447.18376: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204447.21506: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204447.21544: _low_level_execute_command(): starting 34052 1727204447.21568: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204446.398569-36311-106200502884808/ > /dev/null 2>&1 && sleep 0' 34052 1727204447.22269: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204447.22289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204447.22302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204447.22331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204447.22348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204447.22445: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204447.22475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204447.22573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204447.24687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204447.24715: stdout chunk (state=3): >>><<< 34052 1727204447.24718: stderr chunk (state=3): >>><<< 34052 1727204447.24872: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204447.24876: handler run complete 34052 1727204447.26243: variable 'ansible_facts' from source: unknown 34052 1727204447.26857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204447.29719: variable 'ansible_facts' from source: unknown 34052 1727204447.30938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204447.31949: attempt loop complete, returning result 34052 1727204447.31984: _execute() done 34052 1727204447.31993: dumping result to json 34052 1727204447.32287: done dumping result, returning 34052 1727204447.32305: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-66a4-e2a3-000000000519] 34052 1727204447.32314: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000519 34052 1727204447.38098: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000519 34052 1727204447.38102: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204447.38271: no more pending results, returning what we have 34052 1727204447.38274: results queue empty 34052 1727204447.38275: checking for any_errors_fatal 34052 1727204447.38281: done checking for any_errors_fatal 34052 1727204447.38282: checking for max_fail_percentage 34052 1727204447.38284: done checking for max_fail_percentage 34052 1727204447.38285: checking to see if all hosts have failed and the running result is not ok 34052 1727204447.38286: done checking to see if all hosts have failed 34052 1727204447.38286: getting the remaining hosts for this loop 34052 1727204447.38288: done getting the remaining hosts for this loop 34052 1727204447.38292: getting the next task for host managed-node1 34052 1727204447.38299: done getting next task for host managed-node1 34052 1727204447.38303: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34052 1727204447.38305: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204447.38318: getting variables 34052 1727204447.38319: in VariableManager get_vars() 34052 1727204447.38358: Calling all_inventory to load vars for managed-node1 34052 1727204447.38361: Calling groups_inventory to load vars for managed-node1 34052 1727204447.38364: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204447.38584: Calling all_plugins_play to load vars for managed-node1 34052 1727204447.38588: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204447.38592: Calling groups_plugins_play to load vars for managed-node1 34052 1727204447.42184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204447.46929: done with get_vars() 34052 1727204447.46974: done getting variables 34052 1727204447.47043: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:00:47 -0400 (0:00:01.116) 0:00:33.788 ***** 34052 1727204447.47088: entering _queue_task() for managed-node1/debug 34052 1727204447.47489: worker is 1 (out of 1 available) 34052 1727204447.47505: exiting _queue_task() for managed-node1/debug 34052 1727204447.47520: done queuing things up, now waiting for results queue to drain 34052 1727204447.47521: waiting for pending results... 
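The banner above queues the role's "Print network provider" task from tasks/main.yml:7; the entries that follow resolve the network_provider fact (set earlier via set_fact) and print "Using network provider: nm". A minimal sketch of such a task, assuming the msg wording from the logged output rather than the role's actual source:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
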
34052 1727204447.47857: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 34052 1727204447.48015: in run() - task 127b8e07-fff9-66a4-e2a3-00000000006f 34052 1727204447.48045: variable 'ansible_search_path' from source: unknown 34052 1727204447.48072: variable 'ansible_search_path' from source: unknown 34052 1727204447.48107: calling self._execute() 34052 1727204447.48220: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204447.48272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204447.48276: variable 'omit' from source: magic vars 34052 1727204447.49049: variable 'ansible_distribution_major_version' from source: facts 34052 1727204447.49053: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204447.49056: variable 'omit' from source: magic vars 34052 1727204447.49241: variable 'omit' from source: magic vars 34052 1727204447.49534: variable 'network_provider' from source: set_fact 34052 1727204447.49563: variable 'omit' from source: magic vars 34052 1727204447.49685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204447.49785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204447.50071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204447.50076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204447.50079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204447.50082: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204447.50085: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204447.50088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204447.50263: Set connection var ansible_connection to ssh 34052 1727204447.50472: Set connection var ansible_timeout to 10 34052 1727204447.50476: Set connection var ansible_pipelining to False 34052 1727204447.50478: Set connection var ansible_shell_type to sh 34052 1727204447.50480: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204447.50482: Set connection var ansible_shell_executable to /bin/sh 34052 1727204447.50484: variable 'ansible_shell_executable' from source: unknown 34052 1727204447.50487: variable 'ansible_connection' from source: unknown 34052 1727204447.50489: variable 'ansible_module_compression' from source: unknown 34052 1727204447.50491: variable 'ansible_shell_type' from source: unknown 34052 1727204447.50494: variable 'ansible_shell_executable' from source: unknown 34052 1727204447.50496: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204447.50498: variable 'ansible_pipelining' from source: unknown 34052 1727204447.50499: variable 'ansible_timeout' from source: unknown 34052 1727204447.50722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204447.51052: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 34052 1727204447.51056: variable 'omit' from source: magic vars 34052 1727204447.51058: starting attempt loop 34052 1727204447.51060: running the handler 34052 1727204447.51063: handler run complete 34052 1727204447.51174: attempt loop complete, returning result 34052 1727204447.51182: _execute() done 34052 1727204447.51188: dumping result to json 34052 1727204447.51194: done dumping result, returning 34052 1727204447.51205: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-66a4-e2a3-00000000006f] 34052 1727204447.51212: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000006f 34052 1727204447.51574: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000006f 34052 1727204447.51578: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 34052 1727204447.51653: no more pending results, returning what we have 34052 1727204447.51656: results queue empty 34052 1727204447.51657: checking for any_errors_fatal 34052 1727204447.51668: done checking for any_errors_fatal 34052 1727204447.51669: checking for max_fail_percentage 34052 1727204447.51671: done checking for max_fail_percentage 34052 1727204447.51672: checking to see if all hosts have failed and the running result is not ok 34052 1727204447.51672: done checking to see if all hosts have failed 34052 1727204447.51673: getting the remaining hosts for this loop 34052 1727204447.51675: done getting the remaining hosts for this loop 34052 1727204447.51680: getting the next task for host managed-node1 34052 1727204447.51686: done getting next task for host managed-node1 34052 1727204447.51691: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34052 1727204447.51695: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204447.51707: getting variables 34052 1727204447.51709: in VariableManager get_vars() 34052 1727204447.51758: Calling all_inventory to load vars for managed-node1 34052 1727204447.51762: Calling groups_inventory to load vars for managed-node1 34052 1727204447.51764: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204447.51991: Calling all_plugins_play to load vars for managed-node1 34052 1727204447.51995: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204447.51999: Calling groups_plugins_play to load vars for managed-node1 34052 1727204447.56188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204447.60748: done with get_vars() 34052 1727204447.60999: done getting variables 34052 1727204447.61073: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:00:47 -0400 (0:00:00.140) 0:00:33.928 ***** 34052 1727204447.61118: entering _queue_task() for managed-node1/fail 34052 1727204447.62303: worker is 1 (out of 1 available) 34052 1727204447.62317: exiting _queue_task() for managed-node1/fail 34052 1727204447.62332: done queuing things up, now waiting for results queue to drain 34052 1727204447.62334: waiting for pending results... 
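The task queued above (tasks/main.yml:11) is a guard built on the fail action; the entries that follow evaluate its first condition, network_state != {}, as False against the role defaults, so the guard is skipped. A hedged sketch of the shape such a guard takes; only the logged condition is taken from this run, while the msg wording and the provider check implied by the task name are assumptions:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}
    - network_provider == "initscripts"  # assumed; implied by the task name, not evaluated in this log
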
34052 1727204447.62606: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34052 1727204447.63075: in run() - task 127b8e07-fff9-66a4-e2a3-000000000070 34052 1727204447.63080: variable 'ansible_search_path' from source: unknown 34052 1727204447.63082: variable 'ansible_search_path' from source: unknown 34052 1727204447.63104: calling self._execute() 34052 1727204447.63679: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204447.63684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204447.63687: variable 'omit' from source: magic vars 34052 1727204447.64443: variable 'ansible_distribution_major_version' from source: facts 34052 1727204447.64469: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204447.64748: variable 'network_state' from source: role '' defaults 34052 1727204447.64776: Evaluated conditional (network_state != {}): False 34052 1727204447.64880: when evaluation is False, skipping this task 34052 1727204447.64889: _execute() done 34052 1727204447.64896: dumping result to json 34052 1727204447.64904: done dumping result, returning 34052 1727204447.64917: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-66a4-e2a3-000000000070] 34052 1727204447.64929: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000070 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204447.65109: no more pending results, returning what we have 34052 1727204447.65114: results queue empty 34052 1727204447.65114: checking for any_errors_fatal 34052 1727204447.65123: done checking for any_errors_fatal 34052 1727204447.65124: checking for max_fail_percentage 34052 1727204447.65129: done checking for max_fail_percentage 34052 1727204447.65130: checking to see if all hosts have failed and the running result is not ok 34052 1727204447.65131: done checking to see if all hosts have failed 34052 1727204447.65131: getting the remaining hosts for this loop 34052 1727204447.65133: done getting the remaining hosts for this loop 34052 1727204447.65138: getting the next task for host managed-node1 34052 1727204447.65146: done getting next task for host managed-node1 34052 1727204447.65151: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34052 1727204447.65155: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204447.65180: getting variables 34052 1727204447.65182: in VariableManager get_vars() 34052 1727204447.65235: Calling all_inventory to load vars for managed-node1 34052 1727204447.65238: Calling groups_inventory to load vars for managed-node1 34052 1727204447.65241: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204447.65256: Calling all_plugins_play to load vars for managed-node1 34052 1727204447.65260: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204447.65263: Calling groups_plugins_play to load vars for managed-node1 34052 1727204447.66478: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000070 34052 1727204447.66483: WORKER PROCESS EXITING 34052 1727204447.69379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204447.72113: done with get_vars() 34052 1727204447.72169: done getting variables 34052 1727204447.72243: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:00:47 -0400 (0:00:00.111) 0:00:34.040 ***** 34052 1727204447.72293: entering _queue_task() for managed-node1/fail 34052 1727204447.72739: worker is 1 (out of 1 available) 34052 1727204447.72755: exiting _queue_task() for managed-node1/fail 34052 1727204447.72975: done queuing things up, now waiting for results queue to drain 34052 1727204447.72978: waiting for pending results... 
34052 1727204447.73108: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34052 1727204447.73280: in run() - task 127b8e07-fff9-66a4-e2a3-000000000071 34052 1727204447.73311: variable 'ansible_search_path' from source: unknown 34052 1727204447.73320: variable 'ansible_search_path' from source: unknown 34052 1727204447.73377: calling self._execute() 34052 1727204447.73504: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204447.73523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204447.73548: variable 'omit' from source: magic vars 34052 1727204447.74023: variable 'ansible_distribution_major_version' from source: facts 34052 1727204447.74046: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204447.74213: variable 'network_state' from source: role '' defaults 34052 1727204447.74236: Evaluated conditional (network_state != {}): False 34052 1727204447.74245: when evaluation is False, skipping this task 34052 1727204447.74254: _execute() done 34052 1727204447.74262: dumping result to json 34052 1727204447.74275: done dumping result, returning 34052 1727204447.74321: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-66a4-e2a3-000000000071] 34052 1727204447.74431: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000071 34052 1727204447.74790: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000071 34052 1727204447.74793: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204447.74857: no more pending results, returning what we have 34052 1727204447.74861: results queue empty 34052 1727204447.74861: checking for any_errors_fatal 34052 1727204447.74872: done checking for any_errors_fatal 34052 1727204447.74873: checking for max_fail_percentage 34052 1727204447.74875: done checking for max_fail_percentage 34052 1727204447.74876: checking to see if all hosts have failed and the running result is not ok 34052 1727204447.74877: done checking to see if all hosts have failed 34052 1727204447.74878: getting the remaining hosts for this loop 34052 1727204447.74880: done getting the remaining hosts for this loop 34052 1727204447.74889: getting the next task for host managed-node1 34052 1727204447.74898: done getting next task for host managed-node1 34052 1727204447.74902: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34052 1727204447.74907: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204447.74933: getting variables 34052 1727204447.74936: in VariableManager get_vars() 34052 1727204447.75201: Calling all_inventory to load vars for managed-node1 34052 1727204447.75205: Calling groups_inventory to load vars for managed-node1 34052 1727204447.75209: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204447.75221: Calling all_plugins_play to load vars for managed-node1 34052 1727204447.75224: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204447.75232: Calling groups_plugins_play to load vars for managed-node1 34052 1727204447.79663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204447.84339: done with get_vars() 34052 1727204447.84384: done getting variables 34052 1727204447.84458: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:00:47 -0400 (0:00:00.122) 0:00:34.162 ***** 34052 1727204447.84505: entering _queue_task() for managed-node1/fail 34052 1727204447.85009: worker is 1 (out of 1 available) 34052 1727204447.85023: exiting _queue_task() for managed-node1/fail 34052 1727204447.85037: done queuing things up, now waiting for results queue to drain 34052 1727204447.85039: waiting for pending results... 
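The entries that follow evaluate the two conditions visible for this guard (tasks/main.yml:25): ansible_distribution_major_version | int > 9 is True on this Fedora 40 host, but ansible_distribution in __network_rh_distros is False, so the task is skipped. A sketch reconstructed only from those logged conditions; any further check that a team connection is actually requested is not visible here and is omitted:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
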
34052 1727204447.85304: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34052 1727204447.85576: in run() - task 127b8e07-fff9-66a4-e2a3-000000000072 34052 1727204447.85580: variable 'ansible_search_path' from source: unknown 34052 1727204447.85583: variable 'ansible_search_path' from source: unknown 34052 1727204447.85587: calling self._execute() 34052 1727204447.85669: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204447.85687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204447.85702: variable 'omit' from source: magic vars 34052 1727204447.86123: variable 'ansible_distribution_major_version' from source: facts 34052 1727204447.86143: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204447.86352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204447.90183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204447.90465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204447.90496: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204447.90543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204447.90601: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204447.90900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204447.90941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204447.91042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204447.91093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204447.91246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204447.91672: variable 'ansible_distribution_major_version' from source: facts 34052 1727204447.91675: Evaluated conditional (ansible_distribution_major_version | int > 9): True 34052 1727204447.91756: variable 'ansible_distribution' from source: facts 34052 1727204447.91767: variable '__network_rh_distros' from source: role '' defaults 34052 1727204447.91786: Evaluated conditional (ansible_distribution in __network_rh_distros): False 34052 1727204447.91896: when evaluation is False, skipping this task 34052 1727204447.91904: _execute() done 34052 1727204447.91911: dumping result to json 34052 1727204447.91919: done dumping result, returning 34052 1727204447.91934: done running TaskExecutor() 
for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-66a4-e2a3-000000000072] 34052 1727204447.91943: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000072 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 34052 1727204447.92105: no more pending results, returning what we have 34052 1727204447.92109: results queue empty 34052 1727204447.92110: checking for any_errors_fatal 34052 1727204447.92118: done checking for any_errors_fatal 34052 1727204447.92120: checking for max_fail_percentage 34052 1727204447.92122: done checking for max_fail_percentage 34052 1727204447.92123: checking to see if all hosts have failed and the running result is not ok 34052 1727204447.92124: done checking to see if all hosts have failed 34052 1727204447.92125: getting the remaining hosts for this loop 34052 1727204447.92130: done getting the remaining hosts for this loop 34052 1727204447.92135: getting the next task for host managed-node1 34052 1727204447.92142: done getting next task for host managed-node1 34052 1727204447.92146: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34052 1727204447.92150: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204447.92172: getting variables 34052 1727204447.92175: in VariableManager get_vars() 34052 1727204447.92221: Calling all_inventory to load vars for managed-node1 34052 1727204447.92224: Calling groups_inventory to load vars for managed-node1 34052 1727204447.92229: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204447.92241: Calling all_plugins_play to load vars for managed-node1 34052 1727204447.92244: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204447.92248: Calling groups_plugins_play to load vars for managed-node1 34052 1727204447.93379: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000072 34052 1727204447.93384: WORKER PROCESS EXITING 34052 1727204447.96732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.00799: done with get_vars() 34052 1727204448.00843: done getting variables 34052 1727204448.00917: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.164) 0:00:34.326 ***** 34052 1727204448.00957: entering _queue_task() for managed-node1/dnf 34052 1727204448.01363: worker is 1 (out of 1 available) 34052 1727204448.01580: exiting _queue_task() for managed-node1/dnf 34052 1727204448.01593: done queuing things up, now waiting for results queue to drain 34052 1727204448.01594: waiting for pending results... 
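The task queued above (tasks/main.yml:36) only matters when wireless or team connections are requested; the entries that follow show __network_wireless_connections_defined or __network_team_connections_defined evaluating to False for the single interface connection in network_connections, so the check is skipped. As an illustration of the general pattern rather than the role's verbatim task, a check-mode dnf call can report whether updates for a package set are available without installing anything; the variable and register names below are placeholders:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # placeholder for the role's package list
    state: latest
  check_mode: true
  register: __network_updates  # hypothetical register name
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
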
34052 1727204448.01732: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34052 1727204448.01892: in run() - task 127b8e07-fff9-66a4-e2a3-000000000073 34052 1727204448.01916: variable 'ansible_search_path' from source: unknown 34052 1727204448.01932: variable 'ansible_search_path' from source: unknown 34052 1727204448.01981: calling self._execute() 34052 1727204448.02097: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.02109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.02125: variable 'omit' from source: magic vars 34052 1727204448.02584: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.02606: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.02838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204448.06211: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204448.06300: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204448.06348: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204448.06396: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204448.06435: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204448.06550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.06589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.06623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.06730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.06733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.07010: variable 'ansible_distribution' from source: facts 34052 1727204448.07014: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.07017: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 34052 1727204448.07104: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204448.07263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.07296: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.07325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.07380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.07401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.07458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.07491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.07552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.07574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.07593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.07648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.07684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.07715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.07768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.07872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.07968: variable 'network_connections' from source: task vars 34052 1727204448.07991: variable 'interface' from source: play vars 34052 1727204448.08074: variable 'interface' from source: play vars 34052 1727204448.08170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204448.08373: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204448.08422: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204448.08462: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204448.08499: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204448.08561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204448.08594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204448.08643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.08678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204448.08869: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204448.09047: variable 'network_connections' from source: task vars 34052 1727204448.09059: variable 'interface' from source: play vars 34052 1727204448.09139: variable 'interface' from source: play vars 34052 1727204448.09172: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34052 1727204448.09181: when evaluation is False, skipping this task 34052 1727204448.09188: _execute() done 34052 1727204448.09195: dumping result to json 34052 1727204448.09210: done dumping result, returning 34052 1727204448.09223: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-000000000073] 34052 1727204448.09234: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000073 34052 1727204448.09517: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000073 34052 1727204448.09521: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34052 1727204448.09586: no more pending results, returning what we have 34052 1727204448.09590: results queue empty 34052 1727204448.09591: checking for any_errors_fatal 34052 1727204448.09599: done checking for any_errors_fatal 34052 1727204448.09600: checking for max_fail_percentage 34052 1727204448.09602: done checking for max_fail_percentage 34052 1727204448.09603: checking to see if all hosts have failed and the running result is not ok 34052 1727204448.09604: done checking to see if all hosts have failed 34052 1727204448.09605: getting the remaining hosts for this loop 34052 1727204448.09607: done getting the remaining hosts for this loop 34052 1727204448.09612: getting the next task for host managed-node1 34052 1727204448.09620: done getting next task for host managed-node1 34052 1727204448.09628: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 34052 1727204448.09631: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204448.09652: getting variables 34052 1727204448.09654: in VariableManager get_vars() 34052 1727204448.09705: Calling all_inventory to load vars for managed-node1 34052 1727204448.09708: Calling groups_inventory to load vars for managed-node1 34052 1727204448.09711: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204448.09724: Calling all_plugins_play to load vars for managed-node1 34052 1727204448.09730: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204448.09734: Calling groups_plugins_play to load vars for managed-node1 34052 1727204448.11733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.13906: done with get_vars() 34052 1727204448.13950: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34052 1727204448.14041: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.131) 0:00:34.457 ***** 34052 1727204448.14078: entering _queue_task() for managed-node1/yum 34052 1727204448.14706: worker is 1 (out of 1 available) 34052 1727204448.14719: exiting _queue_task() for managed-node1/yum 34052 1727204448.14734: done queuing things up, now waiting for results queue to drain 34052 1727204448.14736: waiting for pending results... 
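Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above: on this controller the yum action is served by the dnf action plugin. The task itself (tasks/main.yml:48) is the EL7-era counterpart of the DNF check and, as the following entries show, is skipped because ansible_distribution_major_version | int < 8 is False on Fedora 40. A sketch mirroring the DNF example, with only the logged condition taken from this run and the module arguments again placeholders:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # placeholder
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
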
34052 1727204448.14836: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34052 1727204448.15073: in run() - task 127b8e07-fff9-66a4-e2a3-000000000074 34052 1727204448.15078: variable 'ansible_search_path' from source: unknown 34052 1727204448.15080: variable 'ansible_search_path' from source: unknown 34052 1727204448.15083: calling self._execute() 34052 1727204448.15201: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.15216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.15237: variable 'omit' from source: magic vars 34052 1727204448.15675: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.15694: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.15895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204448.19996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204448.20212: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204448.20238: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204448.20283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204448.20572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204448.20605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.20646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.20713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.20841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.20918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.21144: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.21248: Evaluated conditional (ansible_distribution_major_version | int < 8): False 34052 1727204448.21257: when evaluation is False, skipping this task 34052 1727204448.21264: _execute() done 34052 1727204448.21274: dumping result to json 34052 1727204448.21281: done dumping result, returning 34052 1727204448.21293: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-000000000074] 34052 
1727204448.21301: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000074 34052 1727204448.21635: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000074 34052 1727204448.21639: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 34052 1727204448.21709: no more pending results, returning what we have 34052 1727204448.21713: results queue empty 34052 1727204448.21714: checking for any_errors_fatal 34052 1727204448.21720: done checking for any_errors_fatal 34052 1727204448.21721: checking for max_fail_percentage 34052 1727204448.21722: done checking for max_fail_percentage 34052 1727204448.21723: checking to see if all hosts have failed and the running result is not ok 34052 1727204448.21724: done checking to see if all hosts have failed 34052 1727204448.21725: getting the remaining hosts for this loop 34052 1727204448.21729: done getting the remaining hosts for this loop 34052 1727204448.21734: getting the next task for host managed-node1 34052 1727204448.21741: done getting next task for host managed-node1 34052 1727204448.21745: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34052 1727204448.21748: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204448.21771: getting variables 34052 1727204448.21774: in VariableManager get_vars() 34052 1727204448.21819: Calling all_inventory to load vars for managed-node1 34052 1727204448.21822: Calling groups_inventory to load vars for managed-node1 34052 1727204448.21824: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204448.21839: Calling all_plugins_play to load vars for managed-node1 34052 1727204448.21843: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204448.21846: Calling groups_plugins_play to load vars for managed-node1 34052 1727204448.26379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.29646: done with get_vars() 34052 1727204448.29692: done getting variables 34052 1727204448.29771: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.157) 0:00:34.615 ***** 34052 1727204448.29812: entering _queue_task() for managed-node1/fail 34052 1727204448.30229: worker is 1 (out of 1 available) 34052 1727204448.30245: exiting _queue_task() for managed-node1/fail 34052 1727204448.30259: done queuing things up, now waiting for results queue to drain 34052 1727204448.30261: waiting for pending results... 
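The task announced here (roles/network/tasks/main.yml:60) runs through the fail action plugin loaded above. A hedged sketch, assuming a generic failure message (the real wording is not in this trace), with the condition the trace below evaluates:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Restarting NetworkManager is required but was not approved   # assumed message text
      when: __network_wireless_connections_defined or __network_team_connections_defined
      # evaluated False for this run, so the task is skipped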
34052 1727204448.31273: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34052 1727204448.31600: in run() - task 127b8e07-fff9-66a4-e2a3-000000000075 34052 1727204448.31672: variable 'ansible_search_path' from source: unknown 34052 1727204448.31676: variable 'ansible_search_path' from source: unknown 34052 1727204448.31679: calling self._execute() 34052 1727204448.31876: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.31881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.31903: variable 'omit' from source: magic vars 34052 1727204448.32790: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.32886: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.32963: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204448.33202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204448.35400: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204448.35485: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204448.35538: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204448.35670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204448.35674: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204448.35737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.35798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.35835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.35889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.35912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.35972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.36002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.36034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.36082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.36100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.36158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.36194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.36270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.36275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.36294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.36491: variable 'network_connections' from source: task vars 34052 1727204448.36517: variable 'interface' from source: play vars 34052 1727204448.36604: variable 'interface' from source: play vars 34052 1727204448.36700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204448.37071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204448.37075: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204448.37078: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204448.37080: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204448.37099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204448.37136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204448.37173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.37216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204448.37374: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204448.37664: variable 'network_connections' from 
source: task vars 34052 1727204448.37679: variable 'interface' from source: play vars 34052 1727204448.37952: variable 'interface' from source: play vars 34052 1727204448.37991: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34052 1727204448.38001: when evaluation is False, skipping this task 34052 1727204448.38010: _execute() done 34052 1727204448.38018: dumping result to json 34052 1727204448.38029: done dumping result, returning 34052 1727204448.38083: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-000000000075] 34052 1727204448.38093: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000075 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34052 1727204448.38294: no more pending results, returning what we have 34052 1727204448.38297: results queue empty 34052 1727204448.38298: checking for any_errors_fatal 34052 1727204448.38303: done checking for any_errors_fatal 34052 1727204448.38304: checking for max_fail_percentage 34052 1727204448.38306: done checking for max_fail_percentage 34052 1727204448.38307: checking to see if all hosts have failed and the running result is not ok 34052 1727204448.38308: done checking to see if all hosts have failed 34052 1727204448.38308: getting the remaining hosts for this loop 34052 1727204448.38370: done getting the remaining hosts for this loop 34052 1727204448.38376: getting the next task for host managed-node1 34052 1727204448.38384: done getting next task for host managed-node1 34052 1727204448.38389: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34052 1727204448.38392: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204448.38414: getting variables 34052 1727204448.38416: in VariableManager get_vars() 34052 1727204448.38510: Calling all_inventory to load vars for managed-node1 34052 1727204448.38512: Calling groups_inventory to load vars for managed-node1 34052 1727204448.38515: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204448.38526: Calling all_plugins_play to load vars for managed-node1 34052 1727204448.38657: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204448.38663: Calling groups_plugins_play to load vars for managed-node1 34052 1727204448.39379: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000075 34052 1727204448.39383: WORKER PROCESS EXITING 34052 1727204448.40806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.43835: done with get_vars() 34052 1727204448.43880: done getting variables 34052 1727204448.43953: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.141) 0:00:34.757 ***** 34052 1727204448.43996: entering _queue_task() for managed-node1/package 34052 1727204448.44407: worker is 1 (out of 1 available) 34052 1727204448.44428: exiting _queue_task() for managed-node1/package 34052 1727204448.44442: done queuing things up, now waiting for results queue to drain 34052 1727204448.44444: waiting for pending results... 
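The Install packages task queued here (roles/network/tasks/main.yml:73) goes through the generic package action plugin loaded above. A minimal sketch, assuming state: present; only the plugin name and the skip condition actually appear in this trace:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present                       # assumed; not visible in the trace
      when: not network_packages is subset(ansible_facts.packages.keys())
      # reported as False below (every listed package is already installed), so the task is skipped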
34052 1727204448.44758: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 34052 1727204448.44924: in run() - task 127b8e07-fff9-66a4-e2a3-000000000076 34052 1727204448.44951: variable 'ansible_search_path' from source: unknown 34052 1727204448.44958: variable 'ansible_search_path' from source: unknown 34052 1727204448.45007: calling self._execute() 34052 1727204448.45121: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.45136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.45152: variable 'omit' from source: magic vars 34052 1727204448.45586: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.45607: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.45842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204448.46147: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204448.46207: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204448.46252: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204448.46343: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204448.46483: variable 'network_packages' from source: role '' defaults 34052 1727204448.46616: variable '__network_provider_setup' from source: role '' defaults 34052 1727204448.46643: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204448.46721: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204448.46744: variable '__network_packages_default_nm' from source: role '' defaults 34052 1727204448.46817: variable '__network_packages_default_nm' from source: role '' defaults 34052 1727204448.47047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204448.49466: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204448.49673: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204448.49677: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204448.49680: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204448.49682: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204448.49762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.49804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.49840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.49890: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.49914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.49974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.50005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.50049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.50103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.50131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.50414: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34052 1727204448.50673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.50678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.50681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.50683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.50695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.50806: variable 'ansible_python' from source: facts 34052 1727204448.50846: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34052 1727204448.50952: variable '__network_wpa_supplicant_required' from source: role '' defaults 34052 1727204448.51050: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34052 1727204448.51228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.51261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34052 1727204448.51296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.51351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.51375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.51442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.51486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.51518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.51578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.51652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.51775: variable 'network_connections' from source: task vars 34052 1727204448.51787: variable 'interface' from source: play vars 34052 1727204448.51904: variable 'interface' from source: play vars 34052 1727204448.51995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204448.52033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204448.52073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.52116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204448.52177: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204448.52636: variable 'network_connections' from source: task vars 34052 1727204448.52641: variable 'interface' from source: play vars 34052 1727204448.52697: variable 'interface' from source: play vars 34052 1727204448.52749: variable '__network_packages_default_wireless' from source: role '' defaults 34052 1727204448.52850: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204448.53206: variable 'network_connections' from source: task vars 34052 1727204448.53218: variable 
'interface' from source: play vars 34052 1727204448.53303: variable 'interface' from source: play vars 34052 1727204448.53337: variable '__network_packages_default_team' from source: role '' defaults 34052 1727204448.53473: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204448.53799: variable 'network_connections' from source: task vars 34052 1727204448.53813: variable 'interface' from source: play vars 34052 1727204448.53895: variable 'interface' from source: play vars 34052 1727204448.53972: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204448.54044: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204448.54272: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204448.54275: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204448.54382: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34052 1727204448.54964: variable 'network_connections' from source: task vars 34052 1727204448.54984: variable 'interface' from source: play vars 34052 1727204448.55073: variable 'interface' from source: play vars 34052 1727204448.55090: variable 'ansible_distribution' from source: facts 34052 1727204448.55100: variable '__network_rh_distros' from source: role '' defaults 34052 1727204448.55112: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.55136: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34052 1727204448.55336: variable 'ansible_distribution' from source: facts 34052 1727204448.55348: variable '__network_rh_distros' from source: role '' defaults 34052 1727204448.55353: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.55360: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34052 1727204448.55488: variable 'ansible_distribution' from source: facts 34052 1727204448.55492: variable '__network_rh_distros' from source: role '' defaults 34052 1727204448.55497: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.55535: variable 'network_provider' from source: set_fact 34052 1727204448.55552: variable 'ansible_facts' from source: unknown 34052 1727204448.56128: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 34052 1727204448.56132: when evaluation is False, skipping this task 34052 1727204448.56135: _execute() done 34052 1727204448.56137: dumping result to json 34052 1727204448.56139: done dumping result, returning 34052 1727204448.56148: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-66a4-e2a3-000000000076] 34052 1727204448.56151: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000076 34052 1727204448.56251: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000076 34052 1727204448.56254: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 34052 1727204448.56312: no more pending results, returning what we have 34052 1727204448.56316: results queue empty 34052 1727204448.56317: checking for any_errors_fatal 34052 1727204448.56325: done checking for any_errors_fatal 34052 1727204448.56326: 
checking for max_fail_percentage 34052 1727204448.56328: done checking for max_fail_percentage 34052 1727204448.56329: checking to see if all hosts have failed and the running result is not ok 34052 1727204448.56330: done checking to see if all hosts have failed 34052 1727204448.56330: getting the remaining hosts for this loop 34052 1727204448.56332: done getting the remaining hosts for this loop 34052 1727204448.56336: getting the next task for host managed-node1 34052 1727204448.56343: done getting next task for host managed-node1 34052 1727204448.56347: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34052 1727204448.56350: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204448.56381: getting variables 34052 1727204448.56383: in VariableManager get_vars() 34052 1727204448.56427: Calling all_inventory to load vars for managed-node1 34052 1727204448.56430: Calling groups_inventory to load vars for managed-node1 34052 1727204448.56432: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204448.56443: Calling all_plugins_play to load vars for managed-node1 34052 1727204448.56446: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204448.56449: Calling groups_plugins_play to load vars for managed-node1 34052 1727204448.57713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.59652: done with get_vars() 34052 1727204448.59685: done getting variables 34052 1727204448.59736: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.157) 0:00:34.914 ***** 34052 1727204448.59767: entering _queue_task() for managed-node1/package 34052 1727204448.60064: worker is 1 (out of 1 available) 34052 1727204448.60083: exiting _queue_task() for managed-node1/package 34052 1727204448.60098: done queuing things up, now waiting for results queue to drain 34052 1727204448.60100: waiting for pending results... 
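A comparable sketch for the task queued here (roles/network/tasks/main.yml:85); the package list is an assumption taken from the task title, while the network_state condition is the one the trace evaluates:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:                                # assumed package names, inferred from the task title
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}              # evaluated False below, so the task is skipped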
34052 1727204448.60323: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34052 1727204448.60420: in run() - task 127b8e07-fff9-66a4-e2a3-000000000077 34052 1727204448.60436: variable 'ansible_search_path' from source: unknown 34052 1727204448.60440: variable 'ansible_search_path' from source: unknown 34052 1727204448.60475: calling self._execute() 34052 1727204448.60564: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.60573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.60582: variable 'omit' from source: magic vars 34052 1727204448.60897: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.60903: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.61000: variable 'network_state' from source: role '' defaults 34052 1727204448.61012: Evaluated conditional (network_state != {}): False 34052 1727204448.61015: when evaluation is False, skipping this task 34052 1727204448.61018: _execute() done 34052 1727204448.61021: dumping result to json 34052 1727204448.61023: done dumping result, returning 34052 1727204448.61034: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-66a4-e2a3-000000000077] 34052 1727204448.61037: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000077 34052 1727204448.61138: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000077 34052 1727204448.61142: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204448.61194: no more pending results, returning what we have 34052 1727204448.61198: results queue empty 34052 1727204448.61199: checking for any_errors_fatal 34052 1727204448.61211: done checking for any_errors_fatal 34052 1727204448.61211: checking for max_fail_percentage 34052 1727204448.61213: done checking for max_fail_percentage 34052 1727204448.61214: checking to see if all hosts have failed and the running result is not ok 34052 1727204448.61215: done checking to see if all hosts have failed 34052 1727204448.61216: getting the remaining hosts for this loop 34052 1727204448.61217: done getting the remaining hosts for this loop 34052 1727204448.61222: getting the next task for host managed-node1 34052 1727204448.61229: done getting next task for host managed-node1 34052 1727204448.61233: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34052 1727204448.61237: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204448.61258: getting variables 34052 1727204448.61259: in VariableManager get_vars() 34052 1727204448.61302: Calling all_inventory to load vars for managed-node1 34052 1727204448.61305: Calling groups_inventory to load vars for managed-node1 34052 1727204448.61307: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204448.61318: Calling all_plugins_play to load vars for managed-node1 34052 1727204448.61321: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204448.61324: Calling groups_plugins_play to load vars for managed-node1 34052 1727204448.69228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.71393: done with get_vars() 34052 1727204448.71434: done getting variables 34052 1727204448.71500: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.117) 0:00:35.032 ***** 34052 1727204448.71536: entering _queue_task() for managed-node1/package 34052 1727204448.71943: worker is 1 (out of 1 available) 34052 1727204448.71958: exiting _queue_task() for managed-node1/package 34052 1727204448.72178: done queuing things up, now waiting for results queue to drain 34052 1727204448.72180: waiting for pending results... 
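The same pattern applies to roles/network/tasks/main.yml:96, sketched under the same assumptions (package action, state: present) with the condition shown in the trace:

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate             # assumed, taken from the task title
        state: present
      when: network_state != {}              # again evaluated False, so the task is skipped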
34052 1727204448.72424: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34052 1727204448.72517: in run() - task 127b8e07-fff9-66a4-e2a3-000000000078 34052 1727204448.72571: variable 'ansible_search_path' from source: unknown 34052 1727204448.72575: variable 'ansible_search_path' from source: unknown 34052 1727204448.72598: calling self._execute() 34052 1727204448.72717: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.72735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.72770: variable 'omit' from source: magic vars 34052 1727204448.73205: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.73227: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.73383: variable 'network_state' from source: role '' defaults 34052 1727204448.73479: Evaluated conditional (network_state != {}): False 34052 1727204448.73483: when evaluation is False, skipping this task 34052 1727204448.73487: _execute() done 34052 1727204448.73490: dumping result to json 34052 1727204448.73493: done dumping result, returning 34052 1727204448.73496: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-66a4-e2a3-000000000078] 34052 1727204448.73499: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000078 34052 1727204448.73587: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000078 34052 1727204448.73590: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204448.73647: no more pending results, returning what we have 34052 1727204448.73654: results queue empty 34052 1727204448.73655: checking for any_errors_fatal 34052 1727204448.73671: done checking for any_errors_fatal 34052 1727204448.73672: checking for max_fail_percentage 34052 1727204448.73674: done checking for max_fail_percentage 34052 1727204448.73675: checking to see if all hosts have failed and the running result is not ok 34052 1727204448.73675: done checking to see if all hosts have failed 34052 1727204448.73676: getting the remaining hosts for this loop 34052 1727204448.73678: done getting the remaining hosts for this loop 34052 1727204448.73683: getting the next task for host managed-node1 34052 1727204448.73693: done getting next task for host managed-node1 34052 1727204448.73698: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34052 1727204448.73701: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204448.73726: getting variables 34052 1727204448.73728: in VariableManager get_vars() 34052 1727204448.73884: Calling all_inventory to load vars for managed-node1 34052 1727204448.73888: Calling groups_inventory to load vars for managed-node1 34052 1727204448.73891: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204448.73907: Calling all_plugins_play to load vars for managed-node1 34052 1727204448.73911: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204448.73915: Calling groups_plugins_play to load vars for managed-node1 34052 1727204448.75917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.78057: done with get_vars() 34052 1727204448.78097: done getting variables 34052 1727204448.78167: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.066) 0:00:35.099 ***** 34052 1727204448.78209: entering _queue_task() for managed-node1/service 34052 1727204448.78605: worker is 1 (out of 1 available) 34052 1727204448.78622: exiting _queue_task() for managed-node1/service 34052 1727204448.78637: done queuing things up, now waiting for results queue to drain 34052 1727204448.78639: waiting for pending results... 
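The restart task queued here (roles/network/tasks/main.yml:109) uses the service action plugin. A hedged sketch: the service name and desired state are assumptions, since only the plugin and the wireless/team condition appear in this trace:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager                 # assumed service name
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
      # evaluated False below, so no restart happens in this run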
34052 1727204448.79091: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34052 1727204448.79186: in run() - task 127b8e07-fff9-66a4-e2a3-000000000079 34052 1727204448.79192: variable 'ansible_search_path' from source: unknown 34052 1727204448.79195: variable 'ansible_search_path' from source: unknown 34052 1727204448.79224: calling self._execute() 34052 1727204448.79344: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.79403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.79407: variable 'omit' from source: magic vars 34052 1727204448.79810: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.79828: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.79978: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204448.80206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204448.82676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204448.82783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204448.82833: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204448.82881: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204448.82920: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204448.83022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.83060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.83095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.83150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.83173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.83239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.83272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.83305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34052 1727204448.83358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.83381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.83438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.83471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.83503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.83554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.83652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.83787: variable 'network_connections' from source: task vars 34052 1727204448.83807: variable 'interface' from source: play vars 34052 1727204448.83889: variable 'interface' from source: play vars 34052 1727204448.83982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204448.84178: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204448.84242: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204448.84280: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204448.84319: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204448.84525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204448.84529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204448.84532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.84743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204448.84747: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204448.85339: variable 'network_connections' from source: task vars 34052 1727204448.85411: variable 'interface' from source: 
play vars 34052 1727204448.85532: variable 'interface' from source: play vars 34052 1727204448.85650: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34052 1727204448.85660: when evaluation is False, skipping this task 34052 1727204448.85671: _execute() done 34052 1727204448.85682: dumping result to json 34052 1727204448.85690: done dumping result, returning 34052 1727204448.85703: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-66a4-e2a3-000000000079] 34052 1727204448.85736: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000079 34052 1727204448.86148: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000079 34052 1727204448.86158: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34052 1727204448.86226: no more pending results, returning what we have 34052 1727204448.86230: results queue empty 34052 1727204448.86230: checking for any_errors_fatal 34052 1727204448.86238: done checking for any_errors_fatal 34052 1727204448.86239: checking for max_fail_percentage 34052 1727204448.86240: done checking for max_fail_percentage 34052 1727204448.86241: checking to see if all hosts have failed and the running result is not ok 34052 1727204448.86242: done checking to see if all hosts have failed 34052 1727204448.86243: getting the remaining hosts for this loop 34052 1727204448.86245: done getting the remaining hosts for this loop 34052 1727204448.86250: getting the next task for host managed-node1 34052 1727204448.86256: done getting next task for host managed-node1 34052 1727204448.86261: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34052 1727204448.86264: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204448.86286: getting variables 34052 1727204448.86288: in VariableManager get_vars() 34052 1727204448.86331: Calling all_inventory to load vars for managed-node1 34052 1727204448.86334: Calling groups_inventory to load vars for managed-node1 34052 1727204448.86336: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204448.86348: Calling all_plugins_play to load vars for managed-node1 34052 1727204448.86351: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204448.86355: Calling groups_plugins_play to load vars for managed-node1 34052 1727204448.90201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204448.92637: done with get_vars() 34052 1727204448.92681: done getting variables 34052 1727204448.92749: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.145) 0:00:35.245 ***** 34052 1727204448.92786: entering _queue_task() for managed-node1/service 34052 1727204448.93220: worker is 1 (out of 1 available) 34052 1727204448.93236: exiting _queue_task() for managed-node1/service 34052 1727204448.93253: done queuing things up, now waiting for results queue to drain 34052 1727204448.93255: waiting for pending results... 34052 1727204448.93586: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34052 1727204448.93753: in run() - task 127b8e07-fff9-66a4-e2a3-00000000007a 34052 1727204448.93780: variable 'ansible_search_path' from source: unknown 34052 1727204448.93788: variable 'ansible_search_path' from source: unknown 34052 1727204448.93839: calling self._execute() 34052 1727204448.93961: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204448.93976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204448.93990: variable 'omit' from source: magic vars 34052 1727204448.94431: variable 'ansible_distribution_major_version' from source: facts 34052 1727204448.94451: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204448.94655: variable 'network_provider' from source: set_fact 34052 1727204448.94682: variable 'network_state' from source: role '' defaults 34052 1727204448.94871: Evaluated conditional (network_provider == "nm" or network_state != {}): True 34052 1727204448.94875: variable 'omit' from source: magic vars 34052 1727204448.94877: variable 'omit' from source: magic vars 34052 1727204448.94880: variable 'network_service_name' from source: role '' defaults 34052 1727204448.94886: variable 'network_service_name' from source: role '' defaults 34052 1727204448.95010: variable '__network_provider_setup' from source: role '' defaults 34052 1727204448.95023: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204448.95087: variable '__network_service_name_default_nm' from source: role '' defaults 34052 1727204448.95103: variable '__network_packages_default_nm' from source: role '' defaults 
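Unlike the skipped tasks above, the Enable and start NetworkManager task (roles/network/tasks/main.yml:122) has its condition evaluated True in this trace, so it proceeds. A sketch assuming a started/enabled service state; the service name comes from the network_service_name role default resolved in the surrounding trace:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started                       # assumed desired state
        enabled: true
      when: network_provider == "nm" or network_state != {}   # evaluated True here, so the task runs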
34052 1727204448.95163: variable '__network_packages_default_nm' from source: role '' defaults 34052 1727204448.95417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204448.98264: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204448.98371: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204448.98423: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204448.98473: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204448.98513: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204448.98614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.98653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.98695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.98745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.98769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.98834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.98867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.98909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.98952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.99019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.99246: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34052 1727204448.99384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204448.99413: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204448.99441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204448.99492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204448.99511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204448.99670: variable 'ansible_python' from source: facts 34052 1727204448.99675: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34052 1727204448.99753: variable '__network_wpa_supplicant_required' from source: role '' defaults 34052 1727204448.99848: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34052 1727204448.99993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204449.00027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204449.00058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.00112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204449.00133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204449.00208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204449.00476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204449.00480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.00483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204449.00485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204449.00495: variable 'network_connections' from 
source: task vars 34052 1727204449.00509: variable 'interface' from source: play vars 34052 1727204449.00610: variable 'interface' from source: play vars 34052 1727204449.00739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204449.00975: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204449.01034: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204449.01088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204449.01135: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204449.01258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204449.01262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204449.01293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.01332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 1727204449.01394: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204449.01728: variable 'network_connections' from source: task vars 34052 1727204449.01743: variable 'interface' from source: play vars 34052 1727204449.01835: variable 'interface' from source: play vars 34052 1727204449.01908: variable '__network_packages_default_wireless' from source: role '' defaults 34052 1727204449.01974: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204449.02310: variable 'network_connections' from source: task vars 34052 1727204449.02320: variable 'interface' from source: play vars 34052 1727204449.02403: variable 'interface' from source: play vars 34052 1727204449.02435: variable '__network_packages_default_team' from source: role '' defaults 34052 1727204449.02561: variable '__network_team_connections_defined' from source: role '' defaults 34052 1727204449.02867: variable 'network_connections' from source: task vars 34052 1727204449.02879: variable 'interface' from source: play vars 34052 1727204449.02956: variable 'interface' from source: play vars 34052 1727204449.03028: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204449.03107: variable '__network_service_name_default_initscripts' from source: role '' defaults 34052 1727204449.03113: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204449.03215: variable '__network_packages_default_initscripts' from source: role '' defaults 34052 1727204449.03425: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34052 1727204449.03997: variable 'network_connections' from source: task vars 34052 1727204449.04011: variable 'interface' from source: play vars 34052 1727204449.04087: variable 'interface' from source: play vars 34052 
1727204449.04100: variable 'ansible_distribution' from source: facts 34052 1727204449.04176: variable '__network_rh_distros' from source: role '' defaults 34052 1727204449.04179: variable 'ansible_distribution_major_version' from source: facts 34052 1727204449.04182: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34052 1727204449.04342: variable 'ansible_distribution' from source: facts 34052 1727204449.04351: variable '__network_rh_distros' from source: role '' defaults 34052 1727204449.04360: variable 'ansible_distribution_major_version' from source: facts 34052 1727204449.04374: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34052 1727204449.04600: variable 'ansible_distribution' from source: facts 34052 1727204449.04609: variable '__network_rh_distros' from source: role '' defaults 34052 1727204449.04622: variable 'ansible_distribution_major_version' from source: facts 34052 1727204449.04669: variable 'network_provider' from source: set_fact 34052 1727204449.04700: variable 'omit' from source: magic vars 34052 1727204449.04741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204449.04847: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204449.04851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204449.04853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204449.04855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204449.04883: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204449.04891: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.04899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.05011: Set connection var ansible_connection to ssh 34052 1727204449.05025: Set connection var ansible_timeout to 10 34052 1727204449.05037: Set connection var ansible_pipelining to False 34052 1727204449.05045: Set connection var ansible_shell_type to sh 34052 1727204449.05060: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204449.05078: Set connection var ansible_shell_executable to /bin/sh 34052 1727204449.05113: variable 'ansible_shell_executable' from source: unknown 34052 1727204449.05121: variable 'ansible_connection' from source: unknown 34052 1727204449.05129: variable 'ansible_module_compression' from source: unknown 34052 1727204449.05173: variable 'ansible_shell_type' from source: unknown 34052 1727204449.05176: variable 'ansible_shell_executable' from source: unknown 34052 1727204449.05178: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.05180: variable 'ansible_pipelining' from source: unknown 34052 1727204449.05182: variable 'ansible_timeout' from source: unknown 34052 1727204449.05185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.05298: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 34052 1727204449.05316: variable 'omit' from source: magic vars 34052 1727204449.05370: starting attempt loop 34052 1727204449.05374: running the handler 34052 1727204449.05445: variable 'ansible_facts' from source: unknown 34052 1727204449.06424: _low_level_execute_command(): starting 34052 1727204449.06438: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204449.07260: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.07302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204449.07333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.07378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.07439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.09236: stdout chunk (state=3): >>>/root <<< 34052 1727204449.09453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204449.09457: stdout chunk (state=3): >>><<< 34052 1727204449.09460: stderr chunk (state=3): >>><<< 34052 1727204449.09482: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204449.09501: _low_level_execute_command(): starting 34052 1727204449.09596: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477 `" && echo ansible-tmp-1727204449.0948877-36396-193902929468477="` echo /root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477 `" ) && sleep 0' 34052 1727204449.10236: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204449.10254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204449.10371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204449.10392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204449.10412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.10436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.10522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.12612: stdout chunk (state=3): >>>ansible-tmp-1727204449.0948877-36396-193902929468477=/root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477 <<< 34052 1727204449.12819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204449.12833: stdout chunk (state=3): >>><<< 34052 1727204449.12854: stderr chunk (state=3): >>><<< 34052 1727204449.12880: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204449.0948877-36396-193902929468477=/root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204449.12925: variable 
'ansible_module_compression' from source: unknown 34052 1727204449.12998: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 34052 1727204449.13081: variable 'ansible_facts' from source: unknown 34052 1727204449.13377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/AnsiballZ_systemd.py 34052 1727204449.13505: Sending initial data 34052 1727204449.13515: Sent initial data (156 bytes) 34052 1727204449.14373: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204449.14378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.14437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.14483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.16186: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34052 1727204449.16218: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204449.16264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204449.16334: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpp9kfks19 /root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/AnsiballZ_systemd.py <<< 34052 1727204449.16337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/AnsiballZ_systemd.py" <<< 34052 1727204449.16380: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpp9kfks19" to remote "/root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/AnsiballZ_systemd.py" <<< 34052 1727204449.18451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204449.18455: stdout chunk (state=3): >>><<< 34052 1727204449.18458: stderr chunk (state=3): >>><<< 34052 1727204449.18460: done transferring module to remote 34052 1727204449.18462: _low_level_execute_command(): starting 34052 1727204449.18468: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/ /root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/AnsiballZ_systemd.py && sleep 0' 34052 1727204449.19082: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204449.19090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204449.19101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204449.19117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204449.19131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204449.19139: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204449.19149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.19162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204449.19172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204449.19179: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34052 1727204449.19187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204449.19196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204449.19209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204449.19217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204449.19224: stderr chunk (state=3): >>>debug2: match found <<< 34052 1727204449.19237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.19313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204449.19330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.19336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 
1727204449.19420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.21463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204449.21470: stdout chunk (state=3): >>><<< 34052 1727204449.21473: stderr chunk (state=3): >>><<< 34052 1727204449.21492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204449.21589: _low_level_execute_command(): starting 34052 1727204449.21593: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/AnsiballZ_systemd.py && sleep 0' 34052 1727204449.22227: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204449.22282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.22371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.22394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.22498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.55218: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", 
"TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "673", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ExecMainStartTimestampMonotonic": "31464158", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "673", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11898880", "MemoryPeak": "13578240", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3487178752", "CPUUsageNSec": "1734896000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", 
"CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": 
"15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target", "After": "systemd-journald.socket dbus-broker.service basic.target network-pre.target sysinit.target dbus.socket system.slice cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": <<< 34052 1727204449.55233: stdout chunk (state=3): >>>"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:13 EDT", "StateChangeTimestampMonotonic": "359196339", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:44 EDT", "InactiveExitTimestampMonotonic": "31464340", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:46 EDT", "ActiveEnterTimestampMonotonic": "32958713", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ConditionTimestampMonotonic": "31456341", "AssertTimestamp": "Tue 2024-09-24 14:48:44 EDT", "AssertTimestampMonotonic": "31456345", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "0b953f6a210e485cbebf0a8e98fe18d8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 34052 1727204449.57430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204449.57435: stdout chunk (state=3): >>><<< 34052 1727204449.57437: stderr chunk (state=3): >>><<< 34052 1727204449.57457: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "673", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ExecMainStartTimestampMonotonic": "31464158", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "673", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11898880", "MemoryPeak": "13578240", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3487178752", "CPUUsageNSec": "1734896000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target", "After": "systemd-journald.socket dbus-broker.service basic.target network-pre.target sysinit.target dbus.socket system.slice cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:13 EDT", "StateChangeTimestampMonotonic": "359196339", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:44 EDT", "InactiveExitTimestampMonotonic": "31464340", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:46 EDT", "ActiveEnterTimestampMonotonic": "32958713", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:44 EDT", "ConditionTimestampMonotonic": "31456341", "AssertTimestamp": "Tue 2024-09-24 14:48:44 EDT", "AssertTimestampMonotonic": "31456345", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "0b953f6a210e485cbebf0a8e98fe18d8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 
originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 34052 1727204449.57783: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204449.57787: _low_level_execute_command(): starting 34052 1727204449.57789: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204449.0948877-36396-193902929468477/ > /dev/null 2>&1 && sleep 0' 34052 1727204449.58604: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.58678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204449.58710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.58733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.58830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.60880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204449.60885: stdout chunk (state=3): 
>>><<< 34052 1727204449.60894: stderr chunk (state=3): >>><<< 34052 1727204449.60989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204449.60992: handler run complete 34052 1727204449.60995: attempt loop complete, returning result 34052 1727204449.60997: _execute() done 34052 1727204449.60999: dumping result to json 34052 1727204449.61001: done dumping result, returning 34052 1727204449.61003: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-66a4-e2a3-00000000007a] 34052 1727204449.61006: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007a ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 1727204449.61556: no more pending results, returning what we have 34052 1727204449.61559: results queue empty 34052 1727204449.61560: checking for any_errors_fatal 34052 1727204449.61564: done checking for any_errors_fatal 34052 1727204449.61566: checking for max_fail_percentage 34052 1727204449.61568: done checking for max_fail_percentage 34052 1727204449.61569: checking to see if all hosts have failed and the running result is not ok 34052 1727204449.61574: done checking to see if all hosts have failed 34052 1727204449.61575: getting the remaining hosts for this loop 34052 1727204449.61576: done getting the remaining hosts for this loop 34052 1727204449.61580: getting the next task for host managed-node1 34052 1727204449.61585: done getting next task for host managed-node1 34052 1727204449.61589: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34052 1727204449.61600: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204449.61615: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007a 34052 1727204449.61618: WORKER PROCESS EXITING 34052 1727204449.61623: getting variables 34052 1727204449.61624: in VariableManager get_vars() 34052 1727204449.61663: Calling all_inventory to load vars for managed-node1 34052 1727204449.61668: Calling groups_inventory to load vars for managed-node1 34052 1727204449.61671: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204449.61684: Calling all_plugins_play to load vars for managed-node1 34052 1727204449.61687: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204449.61691: Calling groups_plugins_play to load vars for managed-node1 34052 1727204449.63188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204449.64442: done with get_vars() 34052 1727204449.64471: done getting variables 34052 1727204449.64531: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.717) 0:00:35.962 ***** 34052 1727204449.64560: entering _queue_task() for managed-node1/service 34052 1727204449.64874: worker is 1 (out of 1 available) 34052 1727204449.64893: exiting _queue_task() for managed-node1/service 34052 1727204449.64906: done queuing things up, now waiting for results queue to drain 34052 1727204449.64908: waiting for pending results... 
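The task that just completed shows the full round trip the ssh connection plugin performs for one module invocation: discover the remote home, create a per-task tmp directory, transfer the AnsiballZ_systemd.py payload over sftp, chmod it, run it with the remote Python, and remove the directory. Below is a local re-enactment of that command sequence, not Ansible code: the shell commands are taken from the log, while the tmp-dir naming and the stand-in payload are assumptions for illustration (substitute your own python3 path if /usr/bin/python3.12 is not installed).

```python
# Local re-enactment of the logged command sequence: home discovery, tmp dir creation,
# payload drop-in, chmod, execute, cleanup. Not Ansible's connection plugin.
import pathlib
import random
import subprocess
import time

def sh(cmd: str) -> str:
    """Run a command through /bin/sh -c, the way _low_level_execute_command() does."""
    return subprocess.run(["/bin/sh", "-c", cmd],
                          capture_output=True, text=True, check=True).stdout

home = sh("echo ~ && sleep 0").strip()  # the log's first low-level command returned /root
tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-{random.randint(0, 2**48)}"  # naming is an assumption
sh(f'( umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmpdir}" ) && sleep 0')

# In the real run, sftp uploads AnsiballZ_systemd.py here; a tiny stand-in is written instead.
payload = pathlib.Path(tmpdir, "AnsiballZ_systemd.py")
payload.write_text("print('stand-in module payload ran')\n")

sh(f"chmod u+x {tmpdir}/ {payload} && sleep 0")
print(sh(f"/usr/bin/python3.12 {payload} && sleep 0"), end="")  # use plain python3 if 3.12 is absent
sh(f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0")
```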
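The module result captured earlier in the trace is essentially a property dump for NetworkManager.service ("ActiveState": "active", "UnitFileState": "enabled", "MainPID": "673", and so on), which the censored "ok:" line then hides because of no_log. A rough way to reproduce that kind of snapshot outside Ansible is to query systemctl directly; the sketch below is an assumption about how one might do that with `systemctl show`, not the systemd module's own implementation.

```python
# Sketch: fetch a handful of the unit properties seen in the module result above
# via `systemctl show` and its key=value output. Not the ansible systemd module.
import subprocess

def unit_properties(unit, properties):
    """Return the requested systemd properties for *unit* as a dict."""
    out = subprocess.run(
        ["systemctl", "show", unit, "--property=" + ",".join(properties)],
        capture_output=True, text=True, check=True,
    ).stdout
    return dict(line.split("=", 1) for line in out.splitlines() if "=" in line)

if __name__ == "__main__":
    props = unit_properties(
        "NetworkManager.service",
        ["ActiveState", "SubState", "UnitFileState", "MainPID", "FragmentPath"],
    )
    for key, value in props.items():
        print(f"{key}={value}")
```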
34052 1727204449.65384: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34052 1727204449.65391: in run() - task 127b8e07-fff9-66a4-e2a3-00000000007b 34052 1727204449.65394: variable 'ansible_search_path' from source: unknown 34052 1727204449.65397: variable 'ansible_search_path' from source: unknown 34052 1727204449.65456: calling self._execute() 34052 1727204449.65581: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.65597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.65618: variable 'omit' from source: magic vars 34052 1727204449.66073: variable 'ansible_distribution_major_version' from source: facts 34052 1727204449.66093: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204449.66228: variable 'network_provider' from source: set_fact 34052 1727204449.66241: Evaluated conditional (network_provider == "nm"): True 34052 1727204449.66350: variable '__network_wpa_supplicant_required' from source: role '' defaults 34052 1727204449.66473: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34052 1727204449.66663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204449.68581: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204449.68682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204449.68686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204449.68871: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204449.68875: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204449.68879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204449.68890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204449.68916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.68963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204449.68984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204449.69042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204449.69070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34052 1727204449.69094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.69141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204449.69159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204449.69221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204449.69239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204449.69262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.69296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204449.69307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204449.69455: variable 'network_connections' from source: task vars 34052 1727204449.69470: variable 'interface' from source: play vars 34052 1727204449.69541: variable 'interface' from source: play vars 34052 1727204449.69657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204449.69805: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204449.69849: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204449.69883: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204449.69913: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204449.69964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34052 1727204449.69987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34052 1727204449.70016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.70047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34052 
1727204449.70101: variable '__network_wireless_connections_defined' from source: role '' defaults 34052 1727204449.70380: variable 'network_connections' from source: task vars 34052 1727204449.70384: variable 'interface' from source: play vars 34052 1727204449.70531: variable 'interface' from source: play vars 34052 1727204449.70535: Evaluated conditional (__network_wpa_supplicant_required): False 34052 1727204449.70537: when evaluation is False, skipping this task 34052 1727204449.70539: _execute() done 34052 1727204449.70542: dumping result to json 34052 1727204449.70544: done dumping result, returning 34052 1727204449.70546: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-66a4-e2a3-00000000007b] 34052 1727204449.70557: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007b 34052 1727204449.70629: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007b 34052 1727204449.70633: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 34052 1727204449.70687: no more pending results, returning what we have 34052 1727204449.70691: results queue empty 34052 1727204449.70691: checking for any_errors_fatal 34052 1727204449.70716: done checking for any_errors_fatal 34052 1727204449.70716: checking for max_fail_percentage 34052 1727204449.70718: done checking for max_fail_percentage 34052 1727204449.70719: checking to see if all hosts have failed and the running result is not ok 34052 1727204449.70720: done checking to see if all hosts have failed 34052 1727204449.70721: getting the remaining hosts for this loop 34052 1727204449.70722: done getting the remaining hosts for this loop 34052 1727204449.70727: getting the next task for host managed-node1 34052 1727204449.70732: done getting next task for host managed-node1 34052 1727204449.70737: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34052 1727204449.70740: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204449.70761: getting variables 34052 1727204449.70763: in VariableManager get_vars() 34052 1727204449.70806: Calling all_inventory to load vars for managed-node1 34052 1727204449.70809: Calling groups_inventory to load vars for managed-node1 34052 1727204449.70811: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204449.70822: Calling all_plugins_play to load vars for managed-node1 34052 1727204449.70824: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204449.70828: Calling groups_plugins_play to load vars for managed-node1 34052 1727204449.72110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204449.73638: done with get_vars() 34052 1727204449.73678: done getting variables 34052 1727204449.73749: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.092) 0:00:36.055 ***** 34052 1727204449.73789: entering _queue_task() for managed-node1/service 34052 1727204449.74189: worker is 1 (out of 1 available) 34052 1727204449.74205: exiting _queue_task() for managed-node1/service 34052 1727204449.74219: done queuing things up, now waiting for results queue to drain 34052 1727204449.74221: waiting for pending results... 34052 1727204449.74485: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 34052 1727204449.74580: in run() - task 127b8e07-fff9-66a4-e2a3-00000000007c 34052 1727204449.74594: variable 'ansible_search_path' from source: unknown 34052 1727204449.74598: variable 'ansible_search_path' from source: unknown 34052 1727204449.74635: calling self._execute() 34052 1727204449.74722: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.74730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.74741: variable 'omit' from source: magic vars 34052 1727204449.75063: variable 'ansible_distribution_major_version' from source: facts 34052 1727204449.75074: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204449.75163: variable 'network_provider' from source: set_fact 34052 1727204449.75169: Evaluated conditional (network_provider == "initscripts"): False 34052 1727204449.75172: when evaluation is False, skipping this task 34052 1727204449.75175: _execute() done 34052 1727204449.75178: dumping result to json 34052 1727204449.75181: done dumping result, returning 34052 1727204449.75190: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-66a4-e2a3-00000000007c] 34052 1727204449.75195: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007c 34052 1727204449.75296: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007c 34052 1727204449.75299: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34052 
1727204449.75347: no more pending results, returning what we have 34052 1727204449.75350: results queue empty 34052 1727204449.75351: checking for any_errors_fatal 34052 1727204449.75362: done checking for any_errors_fatal 34052 1727204449.75363: checking for max_fail_percentage 34052 1727204449.75364: done checking for max_fail_percentage 34052 1727204449.75367: checking to see if all hosts have failed and the running result is not ok 34052 1727204449.75367: done checking to see if all hosts have failed 34052 1727204449.75368: getting the remaining hosts for this loop 34052 1727204449.75370: done getting the remaining hosts for this loop 34052 1727204449.75374: getting the next task for host managed-node1 34052 1727204449.75380: done getting next task for host managed-node1 34052 1727204449.75385: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34052 1727204449.75389: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204449.75410: getting variables 34052 1727204449.75412: in VariableManager get_vars() 34052 1727204449.75455: Calling all_inventory to load vars for managed-node1 34052 1727204449.75458: Calling groups_inventory to load vars for managed-node1 34052 1727204449.75460: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204449.75478: Calling all_plugins_play to load vars for managed-node1 34052 1727204449.75481: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204449.75485: Calling groups_plugins_play to load vars for managed-node1 34052 1727204449.76532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204449.77877: done with get_vars() 34052 1727204449.77901: done getting variables 34052 1727204449.77955: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.041) 0:00:36.097 ***** 34052 1727204449.77988: entering _queue_task() for managed-node1/copy 34052 1727204449.78293: worker is 1 (out of 1 available) 34052 1727204449.78308: exiting _queue_task() for managed-node1/copy 34052 1727204449.78321: done queuing things up, now waiting for results queue to drain 34052 1727204449.78323: waiting for pending results... 
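The two skips just above follow from the provider selection: network_provider was resolved to "nm" via set_fact earlier in the run, so "Enable and start wpa_supplicant" skips because __network_wpa_supplicant_required is false (apparently because no wireless or IEEE 802.1X connections are defined), and "Enable network service" skips because network_provider == "initscripts" evaluates false. A hypothetical play that would exercise the same code path (the variable and role names come from the log; the play layout itself is illustrative only, and the role can also detect the provider on its own):

    - hosts: managed-node1
      vars:
        network_provider: nm        # nm provider, so all initscripts-only tasks are skipped
      roles:
        - fedora.linux_system_roles.network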
34052 1727204449.78535: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34052 1727204449.78650: in run() - task 127b8e07-fff9-66a4-e2a3-00000000007d 34052 1727204449.78665: variable 'ansible_search_path' from source: unknown 34052 1727204449.78671: variable 'ansible_search_path' from source: unknown 34052 1727204449.78705: calling self._execute() 34052 1727204449.78794: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.78799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.78810: variable 'omit' from source: magic vars 34052 1727204449.79134: variable 'ansible_distribution_major_version' from source: facts 34052 1727204449.79143: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204449.79238: variable 'network_provider' from source: set_fact 34052 1727204449.79243: Evaluated conditional (network_provider == "initscripts"): False 34052 1727204449.79246: when evaluation is False, skipping this task 34052 1727204449.79249: _execute() done 34052 1727204449.79254: dumping result to json 34052 1727204449.79256: done dumping result, returning 34052 1727204449.79268: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-66a4-e2a3-00000000007d] 34052 1727204449.79271: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007d 34052 1727204449.79377: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007d 34052 1727204449.79380: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 34052 1727204449.79433: no more pending results, returning what we have 34052 1727204449.79437: results queue empty 34052 1727204449.79438: checking for any_errors_fatal 34052 1727204449.79445: done checking for any_errors_fatal 34052 1727204449.79445: checking for max_fail_percentage 34052 1727204449.79447: done checking for max_fail_percentage 34052 1727204449.79448: checking to see if all hosts have failed and the running result is not ok 34052 1727204449.79448: done checking to see if all hosts have failed 34052 1727204449.79449: getting the remaining hosts for this loop 34052 1727204449.79451: done getting the remaining hosts for this loop 34052 1727204449.79455: getting the next task for host managed-node1 34052 1727204449.79462: done getting next task for host managed-node1 34052 1727204449.79468: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34052 1727204449.79471: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204449.79494: getting variables 34052 1727204449.79495: in VariableManager get_vars() 34052 1727204449.79539: Calling all_inventory to load vars for managed-node1 34052 1727204449.79542: Calling groups_inventory to load vars for managed-node1 34052 1727204449.79544: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204449.79555: Calling all_plugins_play to load vars for managed-node1 34052 1727204449.79557: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204449.79560: Calling groups_plugins_play to load vars for managed-node1 34052 1727204449.80585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204449.81784: done with get_vars() 34052 1727204449.81813: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.039) 0:00:36.136 ***** 34052 1727204449.81891: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34052 1727204449.82187: worker is 1 (out of 1 available) 34052 1727204449.82202: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34052 1727204449.82217: done queuing things up, now waiting for results queue to drain 34052 1727204449.82219: waiting for pending results... 34052 1727204449.82428: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34052 1727204449.82531: in run() - task 127b8e07-fff9-66a4-e2a3-00000000007e 34052 1727204449.82548: variable 'ansible_search_path' from source: unknown 34052 1727204449.82555: variable 'ansible_search_path' from source: unknown 34052 1727204449.82590: calling self._execute() 34052 1727204449.82682: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.82688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.82698: variable 'omit' from source: magic vars 34052 1727204449.83017: variable 'ansible_distribution_major_version' from source: facts 34052 1727204449.83026: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204449.83035: variable 'omit' from source: magic vars 34052 1727204449.83084: variable 'omit' from source: magic vars 34052 1727204449.83226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204449.85216: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204449.85270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204449.85301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204449.85327: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204449.85349: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204449.85419: variable 'network_provider' from source: set_fact 34052 1727204449.85533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 34052 1727204449.85554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204449.85575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204449.85603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204449.85621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204449.85679: variable 'omit' from source: magic vars 34052 1727204449.85770: variable 'omit' from source: magic vars 34052 1727204449.85851: variable 'network_connections' from source: task vars 34052 1727204449.85861: variable 'interface' from source: play vars 34052 1727204449.85910: variable 'interface' from source: play vars 34052 1727204449.86022: variable 'omit' from source: magic vars 34052 1727204449.86031: variable '__lsr_ansible_managed' from source: task vars 34052 1727204449.86081: variable '__lsr_ansible_managed' from source: task vars 34052 1727204449.86306: Loaded config def from plugin (lookup/template) 34052 1727204449.86310: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 34052 1727204449.86337: File lookup term: get_ansible_managed.j2 34052 1727204449.86340: variable 'ansible_search_path' from source: unknown 34052 1727204449.86346: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 34052 1727204449.86358: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 34052 1727204449.86374: variable 'ansible_search_path' from source: unknown 34052 1727204449.90785: variable 'ansible_managed' from source: unknown 34052 1727204449.90914: variable 'omit' from source: magic vars 34052 1727204449.90946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204449.90970: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204449.90988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204449.91002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204449.91012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204449.91041: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204449.91044: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.91047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.91117: Set connection var ansible_connection to ssh 34052 1727204449.91124: Set connection var ansible_timeout to 10 34052 1727204449.91132: Set connection var ansible_pipelining to False 34052 1727204449.91135: Set connection var ansible_shell_type to sh 34052 1727204449.91142: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204449.91149: Set connection var ansible_shell_executable to /bin/sh 34052 1727204449.91175: variable 'ansible_shell_executable' from source: unknown 34052 1727204449.91178: variable 'ansible_connection' from source: unknown 34052 1727204449.91180: variable 'ansible_module_compression' from source: unknown 34052 1727204449.91199: variable 'ansible_shell_type' from source: unknown 34052 1727204449.91202: variable 'ansible_shell_executable' from source: unknown 34052 1727204449.91205: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204449.91207: variable 'ansible_pipelining' from source: unknown 34052 1727204449.91210: variable 'ansible_timeout' from source: unknown 34052 1727204449.91212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204449.91323: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204449.91335: variable 'omit' from source: magic vars 34052 1727204449.91341: starting attempt loop 34052 1727204449.91344: running the handler 34052 1727204449.91356: _low_level_execute_command(): starting 34052 1727204449.91363: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204449.91908: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204449.91914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204449.91918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 
10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.91974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.91981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.92036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.93844: stdout chunk (state=3): >>>/root <<< 34052 1727204449.94003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204449.94050: stderr chunk (state=3): >>><<< 34052 1727204449.94068: stdout chunk (state=3): >>><<< 34052 1727204449.94114: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204449.94140: _low_level_execute_command(): starting 34052 1727204449.94153: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616 `" && echo ansible-tmp-1727204449.941227-36420-125365061684616="` echo /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616 `" ) && sleep 0' 34052 1727204449.94982: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204449.94989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204449.94992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204449.94996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204449.94998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204449.95001: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204449.95003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.95005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204449.95007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204449.95009: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 34052 1727204449.95011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204449.95012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204449.95015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204449.95017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204449.95019: stderr chunk (state=3): >>>debug2: match found <<< 34052 1727204449.95021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.95124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204449.95128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204449.95130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.95218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204449.97273: stdout chunk (state=3): >>>ansible-tmp-1727204449.941227-36420-125365061684616=/root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616 <<< 34052 1727204449.97773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204449.97777: stderr chunk (state=3): >>><<< 34052 1727204449.97780: stdout chunk (state=3): >>><<< 34052 1727204449.97785: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204449.941227-36420-125365061684616=/root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204449.97788: variable 'ansible_module_compression' from source: unknown 34052 1727204449.97791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 34052 1727204449.97928: variable 'ansible_facts' from source: unknown 34052 1727204449.98035: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/AnsiballZ_network_connections.py 34052 1727204449.98292: Sending initial data 34052 1727204449.98301: Sent initial data (167 bytes) 34052 1727204449.98852: stderr chunk 
(state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204449.98873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204449.98890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204449.98995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204449.99026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204449.99059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204449.99141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204450.00884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204450.00960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204450.01019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp3uj83qda /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/AnsiballZ_network_connections.py <<< 34052 1727204450.01022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/AnsiballZ_network_connections.py" <<< 34052 1727204450.01082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp3uj83qda" to remote "/root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/AnsiballZ_network_connections.py" <<< 34052 1727204450.02290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204450.02486: stderr chunk (state=3): >>><<< 34052 1727204450.02492: stdout chunk (state=3): >>><<< 34052 1727204450.02495: done transferring module to remote 34052 1727204450.02497: _low_level_execute_command(): starting 34052 1727204450.02499: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/ /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/AnsiballZ_network_connections.py && sleep 0' 34052 1727204450.03374: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found <<< 34052 1727204450.03379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.03398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204450.03422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.03501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204450.05494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204450.05720: stderr chunk (state=3): >>><<< 34052 1727204450.05724: stdout chunk (state=3): >>><<< 34052 1727204450.05731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204450.05734: _low_level_execute_command(): starting 34052 1727204450.05736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/AnsiballZ_network_connections.py && sleep 0' 34052 1727204450.06417: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204450.06436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.06524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204450.06542: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.06576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.06595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204450.06618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.06719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204450.44872: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ex0bglu4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 34052 1727204450.44886: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ex0bglu4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/cbb7d200-7555-4a5b-af25-f6d228b691ef: error=unknown <<< 34052 
1727204450.45078: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 34052 1727204450.47109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204450.47171: stderr chunk (state=3): >>><<< 34052 1727204450.47175: stdout chunk (state=3): >>><<< 34052 1727204450.47190: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ex0bglu4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ex0bglu4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/cbb7d200-7555-4a5b-af25-f6d228b691ef: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.8.176 closed. 34052 1727204450.47230: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204450.47237: _low_level_execute_command(): starting 34052 1727204450.47240: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204449.941227-36420-125365061684616/ > /dev/null 2>&1 && sleep 0' 34052 1727204450.47752: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.47757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.47760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204450.47765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.47819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.47822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204450.47825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.47885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204450.49851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204450.49913: stderr chunk (state=3): >>><<< 34052 1727204450.49917: stdout chunk (state=3): >>><<< 34052 1727204450.49933: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204450.49941: handler run complete 34052 1727204450.49962: attempt loop complete, returning result 34052 1727204450.49966: _execute() done 34052 1727204450.49969: dumping result to json 34052 1727204450.49974: done dumping result, returning 34052 1727204450.49982: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-66a4-e2a3-00000000007e] 34052 1727204450.49986: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007e 34052 1727204450.50103: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007e 34052 1727204450.50108: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 34052 1727204450.50211: no more pending results, returning what we have 34052 1727204450.50214: results queue empty 34052 1727204450.50215: checking for any_errors_fatal 34052 1727204450.50222: done checking for any_errors_fatal 34052 1727204450.50223: checking for max_fail_percentage 34052 1727204450.50225: done checking for max_fail_percentage 34052 1727204450.50225: checking to see if all hosts have failed and the running result is not ok 34052 1727204450.50226: done checking to see if all hosts have failed 34052 1727204450.50227: getting the remaining hosts for this loop 34052 1727204450.50229: done getting the remaining hosts for this loop 34052 1727204450.50233: getting the next task for host managed-node1 34052 1727204450.50238: done getting next task for host managed-node1 34052 1727204450.50242: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34052 1727204450.50246: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204450.50258: getting variables 34052 1727204450.50259: in VariableManager get_vars() 34052 1727204450.50306: Calling all_inventory to load vars for managed-node1 34052 1727204450.50309: Calling groups_inventory to load vars for managed-node1 34052 1727204450.50311: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204450.50322: Calling all_plugins_play to load vars for managed-node1 34052 1727204450.50325: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204450.50328: Calling groups_plugins_play to load vars for managed-node1 34052 1727204450.51486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204450.52651: done with get_vars() 34052 1727204450.52680: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.708) 0:00:36.844 ***** 34052 1727204450.52750: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34052 1727204450.53044: worker is 1 (out of 1 available) 34052 1727204450.53062: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34052 1727204450.53078: done queuing things up, now waiting for results queue to drain 34052 1727204450.53080: waiting for pending results... 34052 1727204450.53292: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 34052 1727204450.53396: in run() - task 127b8e07-fff9-66a4-e2a3-00000000007f 34052 1727204450.53411: variable 'ansible_search_path' from source: unknown 34052 1727204450.53415: variable 'ansible_search_path' from source: unknown 34052 1727204450.53451: calling self._execute() 34052 1727204450.53540: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.53551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.53561: variable 'omit' from source: magic vars 34052 1727204450.53882: variable 'ansible_distribution_major_version' from source: facts 34052 1727204450.53892: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204450.53993: variable 'network_state' from source: role '' defaults 34052 1727204450.54002: Evaluated conditional (network_state != {}): False 34052 1727204450.54006: when evaluation is False, skipping this task 34052 1727204450.54009: _execute() done 34052 1727204450.54012: dumping result to json 34052 1727204450.54014: done dumping result, returning 34052 1727204450.54024: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-66a4-e2a3-00000000007f] 34052 1727204450.54029: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007f 34052 1727204450.54124: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000007f 34052 1727204450.54130: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34052 1727204450.54187: no more pending results, returning what we have 34052 1727204450.54191: results queue empty 34052 1727204450.54192: checking for any_errors_fatal 34052 1727204450.54203: done checking for any_errors_fatal 34052 1727204450.54204: checking for max_fail_percentage 34052 
1727204450.54205: done checking for max_fail_percentage 34052 1727204450.54206: checking to see if all hosts have failed and the running result is not ok 34052 1727204450.54207: done checking to see if all hosts have failed 34052 1727204450.54208: getting the remaining hosts for this loop 34052 1727204450.54210: done getting the remaining hosts for this loop 34052 1727204450.54214: getting the next task for host managed-node1 34052 1727204450.54220: done getting next task for host managed-node1 34052 1727204450.54224: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34052 1727204450.54230: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204450.54250: getting variables 34052 1727204450.54252: in VariableManager get_vars() 34052 1727204450.54300: Calling all_inventory to load vars for managed-node1 34052 1727204450.54303: Calling groups_inventory to load vars for managed-node1 34052 1727204450.54305: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204450.54316: Calling all_plugins_play to load vars for managed-node1 34052 1727204450.54318: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204450.54321: Calling groups_plugins_play to load vars for managed-node1 34052 1727204450.55334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204450.56645: done with get_vars() 34052 1727204450.56670: done getting variables 34052 1727204450.56718: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.039) 0:00:36.884 ***** 34052 1727204450.56748: entering _queue_task() for managed-node1/debug 34052 1727204450.57043: worker is 1 (out of 1 available) 34052 1727204450.57061: exiting _queue_task() for managed-node1/debug 34052 1727204450.57075: done queuing things up, now waiting for results queue to drain 34052 1727204450.57077: waiting for pending results... 
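
Note: the module invocation traced just above removed the veth0 connection profile via the 'nm' provider (persistent_state 'absent', state 'down') and reported changed: true, while the follow-up "Configure networking state" task was skipped because network_state evaluated to an empty dict. A minimal sketch of a play that would drive the role this way is shown below; the variable names network_connections and network_state follow the role's documented conventions, but the actual invocation in the test playbook is not visible in this log and may differ.

    - hosts: managed-node1
      tasks:
        - name: Remove the veth0 profile via the network role
          vars:
            network_connections:
              - name: veth0
                persistent_state: absent
                state: down
            network_state: {}   # empty, so the nmstate-based "Configure networking state" task is skipped
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.network
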
34052 1727204450.57278: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34052 1727204450.57382: in run() - task 127b8e07-fff9-66a4-e2a3-000000000080 34052 1727204450.57396: variable 'ansible_search_path' from source: unknown 34052 1727204450.57400: variable 'ansible_search_path' from source: unknown 34052 1727204450.57436: calling self._execute() 34052 1727204450.57525: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.57532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.57536: variable 'omit' from source: magic vars 34052 1727204450.57841: variable 'ansible_distribution_major_version' from source: facts 34052 1727204450.57853: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204450.57858: variable 'omit' from source: magic vars 34052 1727204450.57906: variable 'omit' from source: magic vars 34052 1727204450.57936: variable 'omit' from source: magic vars 34052 1727204450.57973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204450.58006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204450.58024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204450.58040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204450.58050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204450.58079: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204450.58083: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.58085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.58156: Set connection var ansible_connection to ssh 34052 1727204450.58163: Set connection var ansible_timeout to 10 34052 1727204450.58171: Set connection var ansible_pipelining to False 34052 1727204450.58174: Set connection var ansible_shell_type to sh 34052 1727204450.58185: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204450.58191: Set connection var ansible_shell_executable to /bin/sh 34052 1727204450.58212: variable 'ansible_shell_executable' from source: unknown 34052 1727204450.58216: variable 'ansible_connection' from source: unknown 34052 1727204450.58219: variable 'ansible_module_compression' from source: unknown 34052 1727204450.58221: variable 'ansible_shell_type' from source: unknown 34052 1727204450.58224: variable 'ansible_shell_executable' from source: unknown 34052 1727204450.58229: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.58231: variable 'ansible_pipelining' from source: unknown 34052 1727204450.58233: variable 'ansible_timeout' from source: unknown 34052 1727204450.58236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.58349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 
1727204450.58360: variable 'omit' from source: magic vars 34052 1727204450.58367: starting attempt loop 34052 1727204450.58370: running the handler 34052 1727204450.58474: variable '__network_connections_result' from source: set_fact 34052 1727204450.58520: handler run complete 34052 1727204450.58535: attempt loop complete, returning result 34052 1727204450.58538: _execute() done 34052 1727204450.58541: dumping result to json 34052 1727204450.58544: done dumping result, returning 34052 1727204450.58554: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-66a4-e2a3-000000000080] 34052 1727204450.58559: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000080 34052 1727204450.58654: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000080 34052 1727204450.58656: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 34052 1727204450.58730: no more pending results, returning what we have 34052 1727204450.58733: results queue empty 34052 1727204450.58733: checking for any_errors_fatal 34052 1727204450.58741: done checking for any_errors_fatal 34052 1727204450.58742: checking for max_fail_percentage 34052 1727204450.58744: done checking for max_fail_percentage 34052 1727204450.58745: checking to see if all hosts have failed and the running result is not ok 34052 1727204450.58745: done checking to see if all hosts have failed 34052 1727204450.58746: getting the remaining hosts for this loop 34052 1727204450.58748: done getting the remaining hosts for this loop 34052 1727204450.58752: getting the next task for host managed-node1 34052 1727204450.58759: done getting next task for host managed-node1 34052 1727204450.58763: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34052 1727204450.58774: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204450.58787: getting variables 34052 1727204450.58788: in VariableManager get_vars() 34052 1727204450.58833: Calling all_inventory to load vars for managed-node1 34052 1727204450.58836: Calling groups_inventory to load vars for managed-node1 34052 1727204450.58839: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204450.58849: Calling all_plugins_play to load vars for managed-node1 34052 1727204450.58851: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204450.58854: Calling groups_plugins_play to load vars for managed-node1 34052 1727204450.59876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204450.61077: done with get_vars() 34052 1727204450.61106: done getting variables 34052 1727204450.61159: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.044) 0:00:36.929 ***** 34052 1727204450.61191: entering _queue_task() for managed-node1/debug 34052 1727204450.61493: worker is 1 (out of 1 available) 34052 1727204450.61509: exiting _queue_task() for managed-node1/debug 34052 1727204450.61523: done queuing things up, now waiting for results queue to drain 34052 1727204450.61525: waiting for pending results... 34052 1727204450.61731: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34052 1727204450.61833: in run() - task 127b8e07-fff9-66a4-e2a3-000000000081 34052 1727204450.61845: variable 'ansible_search_path' from source: unknown 34052 1727204450.61849: variable 'ansible_search_path' from source: unknown 34052 1727204450.61886: calling self._execute() 34052 1727204450.61969: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.61973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.61986: variable 'omit' from source: magic vars 34052 1727204450.62295: variable 'ansible_distribution_major_version' from source: facts 34052 1727204450.62312: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204450.62315: variable 'omit' from source: magic vars 34052 1727204450.62361: variable 'omit' from source: magic vars 34052 1727204450.62391: variable 'omit' from source: magic vars 34052 1727204450.62433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204450.62462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204450.62482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204450.62496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204450.62506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204450.62536: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204450.62539: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.62542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.62615: Set connection var ansible_connection to ssh 34052 1727204450.62623: Set connection var ansible_timeout to 10 34052 1727204450.62629: Set connection var ansible_pipelining to False 34052 1727204450.62638: Set connection var ansible_shell_type to sh 34052 1727204450.62646: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204450.62655: Set connection var ansible_shell_executable to /bin/sh 34052 1727204450.62676: variable 'ansible_shell_executable' from source: unknown 34052 1727204450.62679: variable 'ansible_connection' from source: unknown 34052 1727204450.62683: variable 'ansible_module_compression' from source: unknown 34052 1727204450.62685: variable 'ansible_shell_type' from source: unknown 34052 1727204450.62687: variable 'ansible_shell_executable' from source: unknown 34052 1727204450.62690: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.62695: variable 'ansible_pipelining' from source: unknown 34052 1727204450.62697: variable 'ansible_timeout' from source: unknown 34052 1727204450.62702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.62817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204450.62829: variable 'omit' from source: magic vars 34052 1727204450.62833: starting attempt loop 34052 1727204450.62836: running the handler 34052 1727204450.62880: variable '__network_connections_result' from source: set_fact 34052 1727204450.62944: variable '__network_connections_result' from source: set_fact 34052 1727204450.63029: handler run complete 34052 1727204450.63047: attempt loop complete, returning result 34052 1727204450.63050: _execute() done 34052 1727204450.63053: dumping result to json 34052 1727204450.63056: done dumping result, returning 34052 1727204450.63065: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-66a4-e2a3-000000000081] 34052 1727204450.63068: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000081 34052 1727204450.63173: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000081 34052 1727204450.63175: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 34052 1727204450.63283: no more pending results, returning what we have 34052 1727204450.63287: results queue empty 34052 1727204450.63288: checking for any_errors_fatal 34052 1727204450.63295: done checking for any_errors_fatal 34052 1727204450.63297: checking for max_fail_percentage 34052 1727204450.63299: done checking for max_fail_percentage 34052 1727204450.63300: 
checking to see if all hosts have failed and the running result is not ok 34052 1727204450.63301: done checking to see if all hosts have failed 34052 1727204450.63301: getting the remaining hosts for this loop 34052 1727204450.63303: done getting the remaining hosts for this loop 34052 1727204450.63307: getting the next task for host managed-node1 34052 1727204450.63314: done getting next task for host managed-node1 34052 1727204450.63318: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34052 1727204450.63321: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204450.63333: getting variables 34052 1727204450.63334: in VariableManager get_vars() 34052 1727204450.63380: Calling all_inventory to load vars for managed-node1 34052 1727204450.63382: Calling groups_inventory to load vars for managed-node1 34052 1727204450.63384: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204450.63394: Calling all_plugins_play to load vars for managed-node1 34052 1727204450.63397: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204450.63400: Calling groups_plugins_play to load vars for managed-node1 34052 1727204450.64569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204450.65752: done with get_vars() 34052 1727204450.65781: done getting variables 34052 1727204450.65833: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.046) 0:00:36.975 ***** 34052 1727204450.65868: entering _queue_task() for managed-node1/debug 34052 1727204450.66159: worker is 1 (out of 1 available) 34052 1727204450.66178: exiting _queue_task() for managed-node1/debug 34052 1727204450.66192: done queuing things up, now waiting for results queue to drain 34052 1727204450.66194: waiting for pending results... 
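
Note: the two debug tasks traced above print the captured result of the network_connections module run. Their observed output (first stderr_lines only, then the full registered fact) is consistent with simple var-style debug tasks along the following lines; this is a sketch, and the real tasks at roles/network/tasks/main.yml lines 177 and 181 may differ in detail.

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result
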
34052 1727204450.66399: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34052 1727204450.66496: in run() - task 127b8e07-fff9-66a4-e2a3-000000000082 34052 1727204450.66510: variable 'ansible_search_path' from source: unknown 34052 1727204450.66514: variable 'ansible_search_path' from source: unknown 34052 1727204450.66549: calling self._execute() 34052 1727204450.66636: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.66640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.66652: variable 'omit' from source: magic vars 34052 1727204450.66959: variable 'ansible_distribution_major_version' from source: facts 34052 1727204450.66972: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204450.67068: variable 'network_state' from source: role '' defaults 34052 1727204450.67078: Evaluated conditional (network_state != {}): False 34052 1727204450.67082: when evaluation is False, skipping this task 34052 1727204450.67084: _execute() done 34052 1727204450.67088: dumping result to json 34052 1727204450.67091: done dumping result, returning 34052 1727204450.67102: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-66a4-e2a3-000000000082] 34052 1727204450.67105: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000082 skipping: [managed-node1] => { "false_condition": "network_state != {}" } 34052 1727204450.67256: no more pending results, returning what we have 34052 1727204450.67260: results queue empty 34052 1727204450.67261: checking for any_errors_fatal 34052 1727204450.67273: done checking for any_errors_fatal 34052 1727204450.67274: checking for max_fail_percentage 34052 1727204450.67276: done checking for max_fail_percentage 34052 1727204450.67277: checking to see if all hosts have failed and the running result is not ok 34052 1727204450.67278: done checking to see if all hosts have failed 34052 1727204450.67278: getting the remaining hosts for this loop 34052 1727204450.67280: done getting the remaining hosts for this loop 34052 1727204450.67285: getting the next task for host managed-node1 34052 1727204450.67291: done getting next task for host managed-node1 34052 1727204450.67296: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34052 1727204450.67299: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204450.67321: getting variables 34052 1727204450.67322: in VariableManager get_vars() 34052 1727204450.67364: Calling all_inventory to load vars for managed-node1 34052 1727204450.67375: Calling groups_inventory to load vars for managed-node1 34052 1727204450.67378: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204450.67383: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000082 34052 1727204450.67386: WORKER PROCESS EXITING 34052 1727204450.67396: Calling all_plugins_play to load vars for managed-node1 34052 1727204450.67399: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204450.67402: Calling groups_plugins_play to load vars for managed-node1 34052 1727204450.68420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204450.69729: done with get_vars() 34052 1727204450.69751: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.039) 0:00:37.015 ***** 34052 1727204450.69834: entering _queue_task() for managed-node1/ping 34052 1727204450.70133: worker is 1 (out of 1 available) 34052 1727204450.70149: exiting _queue_task() for managed-node1/ping 34052 1727204450.70163: done queuing things up, now waiting for results queue to drain 34052 1727204450.70167: waiting for pending results... 34052 1727204450.70361: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34052 1727204450.70460: in run() - task 127b8e07-fff9-66a4-e2a3-000000000083 34052 1727204450.70476: variable 'ansible_search_path' from source: unknown 34052 1727204450.70479: variable 'ansible_search_path' from source: unknown 34052 1727204450.70514: calling self._execute() 34052 1727204450.70599: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.70603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.70613: variable 'omit' from source: magic vars 34052 1727204450.70929: variable 'ansible_distribution_major_version' from source: facts 34052 1727204450.70938: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204450.70948: variable 'omit' from source: magic vars 34052 1727204450.70996: variable 'omit' from source: magic vars 34052 1727204450.71029: variable 'omit' from source: magic vars 34052 1727204450.71066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204450.71104: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204450.71121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204450.71137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204450.71148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204450.71179: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204450.71183: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.71185: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node1' 34052 1727204450.71261: Set connection var ansible_connection to ssh 34052 1727204450.71271: Set connection var ansible_timeout to 10 34052 1727204450.71283: Set connection var ansible_pipelining to False 34052 1727204450.71286: Set connection var ansible_shell_type to sh 34052 1727204450.71289: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204450.71297: Set connection var ansible_shell_executable to /bin/sh 34052 1727204450.71319: variable 'ansible_shell_executable' from source: unknown 34052 1727204450.71322: variable 'ansible_connection' from source: unknown 34052 1727204450.71328: variable 'ansible_module_compression' from source: unknown 34052 1727204450.71330: variable 'ansible_shell_type' from source: unknown 34052 1727204450.71333: variable 'ansible_shell_executable' from source: unknown 34052 1727204450.71335: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204450.71337: variable 'ansible_pipelining' from source: unknown 34052 1727204450.71340: variable 'ansible_timeout' from source: unknown 34052 1727204450.71344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204450.71517: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34052 1727204450.71529: variable 'omit' from source: magic vars 34052 1727204450.71532: starting attempt loop 34052 1727204450.71534: running the handler 34052 1727204450.71548: _low_level_execute_command(): starting 34052 1727204450.71555: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204450.72124: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.72130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.72133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.72193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.72196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204450.72199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.72257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204450.74027: stdout chunk (state=3): >>>/root <<< 34052 1727204450.74131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204450.74200: stderr chunk (state=3): >>><<< 34052 1727204450.74203: stdout chunk 
(state=3): >>><<< 34052 1727204450.74228: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204450.74237: _low_level_execute_command(): starting 34052 1727204450.74245: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179 `" && echo ansible-tmp-1727204450.7422512-36449-198990858665179="` echo /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179 `" ) && sleep 0' 34052 1727204450.74742: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.74746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.74750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204450.74761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.74810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.74813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204450.74819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.74872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204450.76925: stdout chunk (state=3): >>>ansible-tmp-1727204450.7422512-36449-198990858665179=/root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179 <<< 34052 1727204450.77041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
34052 1727204450.77104: stderr chunk (state=3): >>><<< 34052 1727204450.77107: stdout chunk (state=3): >>><<< 34052 1727204450.77125: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204450.7422512-36449-198990858665179=/root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204450.77176: variable 'ansible_module_compression' from source: unknown 34052 1727204450.77213: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 34052 1727204450.77246: variable 'ansible_facts' from source: unknown 34052 1727204450.77304: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/AnsiballZ_ping.py 34052 1727204450.77420: Sending initial data 34052 1727204450.77423: Sent initial data (153 bytes) 34052 1727204450.77928: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.77932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.77935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204450.77937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.77990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.77994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204450.78000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.78057: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 34052 1727204450.79699: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204450.79744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204450.79793: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpfvwncd60 /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/AnsiballZ_ping.py <<< 34052 1727204450.79797: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/AnsiballZ_ping.py" <<< 34052 1727204450.79847: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpfvwncd60" to remote "/root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/AnsiballZ_ping.py" <<< 34052 1727204450.79850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/AnsiballZ_ping.py" <<< 34052 1727204450.80436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204450.80507: stderr chunk (state=3): >>><<< 34052 1727204450.80511: stdout chunk (state=3): >>><<< 34052 1727204450.80536: done transferring module to remote 34052 1727204450.80546: _low_level_execute_command(): starting 34052 1727204450.80551: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/ /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/AnsiballZ_ping.py && sleep 0' 34052 1727204450.81056: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.81060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.81062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204450.81067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204450.81070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.81124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.81128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204450.81130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.81188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204450.83097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204450.83161: stderr chunk (state=3): >>><<< 34052 1727204450.83167: stdout chunk (state=3): >>><<< 34052 1727204450.83180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204450.83183: _low_level_execute_command(): starting 34052 1727204450.83189: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/AnsiballZ_ping.py && sleep 0' 34052 1727204450.83706: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.83710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.83713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204450.83715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204450.83770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204450.83774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 
1727204450.83780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204450.83841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204451.00732: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 34052 1727204451.02060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204451.02117: stderr chunk (state=3): >>><<< 34052 1727204451.02121: stdout chunk (state=3): >>><<< 34052 1727204451.02140: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
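
Note: the {"ping": "pong"} payload above is the role's connectivity re-test completing successfully over the multiplexed SSH connection. Functionally it corresponds to a bare ping task like the sketch below; the real task at roles/network/tasks/main.yml:192 may carry additional arguments not visible here.

    - name: Re-test connectivity
      ansible.builtin.ping:
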
34052 1727204451.02161: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204451.02172: _low_level_execute_command(): starting 34052 1727204451.02177: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204450.7422512-36449-198990858665179/ > /dev/null 2>&1 && sleep 0' 34052 1727204451.02653: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204451.02685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204451.02688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204451.02691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204451.02739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204451.02743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204451.02752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204451.02810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204451.04754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204451.04819: stderr chunk (state=3): >>><<< 34052 1727204451.04823: stdout chunk (state=3): >>><<< 34052 1727204451.04842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204451.04849: handler run complete 34052 1727204451.04864: attempt loop complete, returning result 34052 1727204451.04869: _execute() done 34052 1727204451.04871: dumping result to json 34052 1727204451.04874: done dumping result, returning 34052 1727204451.04885: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-66a4-e2a3-000000000083] 34052 1727204451.04888: sending task result for task 127b8e07-fff9-66a4-e2a3-000000000083 34052 1727204451.04985: done sending task result for task 127b8e07-fff9-66a4-e2a3-000000000083 34052 1727204451.04988: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 34052 1727204451.05055: no more pending results, returning what we have 34052 1727204451.05058: results queue empty 34052 1727204451.05059: checking for any_errors_fatal 34052 1727204451.05069: done checking for any_errors_fatal 34052 1727204451.05070: checking for max_fail_percentage 34052 1727204451.05072: done checking for max_fail_percentage 34052 1727204451.05072: checking to see if all hosts have failed and the running result is not ok 34052 1727204451.05073: done checking to see if all hosts have failed 34052 1727204451.05074: getting the remaining hosts for this loop 34052 1727204451.05076: done getting the remaining hosts for this loop 34052 1727204451.05080: getting the next task for host managed-node1 34052 1727204451.05089: done getting next task for host managed-node1 34052 1727204451.05092: ^ task is: TASK: meta (role_complete) 34052 1727204451.05095: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204451.05109: getting variables 34052 1727204451.05110: in VariableManager get_vars() 34052 1727204451.05153: Calling all_inventory to load vars for managed-node1 34052 1727204451.05156: Calling groups_inventory to load vars for managed-node1 34052 1727204451.05158: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.05175: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.05179: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.05182: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.06200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.08311: done with get_vars() 34052 1727204451.08356: done getting variables 34052 1727204451.08454: done queuing things up, now waiting for results queue to drain 34052 1727204451.08456: results queue empty 34052 1727204451.08457: checking for any_errors_fatal 34052 1727204451.08460: done checking for any_errors_fatal 34052 1727204451.08461: checking for max_fail_percentage 34052 1727204451.08463: done checking for max_fail_percentage 34052 1727204451.08463: checking to see if all hosts have failed and the running result is not ok 34052 1727204451.08464: done checking to see if all hosts have failed 34052 1727204451.08467: getting the remaining hosts for this loop 34052 1727204451.08469: done getting the remaining hosts for this loop 34052 1727204451.08472: getting the next task for host managed-node1 34052 1727204451.08476: done getting next task for host managed-node1 34052 1727204451.08479: ^ task is: TASK: Include the task 'manage_test_interface.yml' 34052 1727204451.08481: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34052 1727204451.08484: getting variables 34052 1727204451.08485: in VariableManager get_vars() 34052 1727204451.08502: Calling all_inventory to load vars for managed-node1 34052 1727204451.08504: Calling groups_inventory to load vars for managed-node1 34052 1727204451.08506: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.08512: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.08514: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.08517: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.10013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.11201: done with get_vars() 34052 1727204451.11231: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.414) 0:00:37.430 ***** 34052 1727204451.11301: entering _queue_task() for managed-node1/include_tasks 34052 1727204451.11656: worker is 1 (out of 1 available) 34052 1727204451.11675: exiting _queue_task() for managed-node1/include_tasks 34052 1727204451.11691: done queuing things up, now waiting for results queue to drain 34052 1727204451.11693: waiting for pending results... 
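
Note: at this point the play leaves the role and pulls in the shared test helper. Based on the task name, the include_tasks action, and the resolved file paths in the log, the calling task at tests_ipv6.yml:104 is roughly of the following shape; the relative path is inferred from the resolved paths, and any vars it passes are not visible in this log.

    - name: Include the task 'manage_test_interface.yml'
      ansible.builtin.include_tasks:
        file: tasks/manage_test_interface.yml
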
34052 1727204451.12204: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 34052 1727204451.12211: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000b3 34052 1727204451.12215: variable 'ansible_search_path' from source: unknown 34052 1727204451.12222: calling self._execute() 34052 1727204451.12294: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.12307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.12323: variable 'omit' from source: magic vars 34052 1727204451.12785: variable 'ansible_distribution_major_version' from source: facts 34052 1727204451.12804: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204451.12823: _execute() done 34052 1727204451.12834: dumping result to json 34052 1727204451.12841: done dumping result, returning 34052 1727204451.12852: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [127b8e07-fff9-66a4-e2a3-0000000000b3] 34052 1727204451.12860: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b3 34052 1727204451.13021: no more pending results, returning what we have 34052 1727204451.13032: in VariableManager get_vars() 34052 1727204451.13087: Calling all_inventory to load vars for managed-node1 34052 1727204451.13090: Calling groups_inventory to load vars for managed-node1 34052 1727204451.13092: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.13110: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.13113: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.13116: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.13654: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b3 34052 1727204451.13658: WORKER PROCESS EXITING 34052 1727204451.15376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.17745: done with get_vars() 34052 1727204451.17782: variable 'ansible_search_path' from source: unknown 34052 1727204451.17801: we have included files to process 34052 1727204451.17802: generating all_blocks data 34052 1727204451.17805: done generating all_blocks data 34052 1727204451.17810: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34052 1727204451.17817: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34052 1727204451.17821: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34052 1727204451.18294: in VariableManager get_vars() 34052 1727204451.18321: done with get_vars() 34052 1727204451.19058: done processing included file 34052 1727204451.19060: iterating over new_blocks loaded from include file 34052 1727204451.19062: in VariableManager get_vars() 34052 1727204451.19088: done with get_vars() 34052 1727204451.19091: filtering new block on tags 34052 1727204451.19135: done filtering new block on tags 34052 1727204451.19139: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 34052 1727204451.19145: extending task lists for 
all hosts with included blocks 34052 1727204451.21875: done extending task lists 34052 1727204451.21877: done processing included files 34052 1727204451.21878: results queue empty 34052 1727204451.21878: checking for any_errors_fatal 34052 1727204451.21881: done checking for any_errors_fatal 34052 1727204451.21881: checking for max_fail_percentage 34052 1727204451.21883: done checking for max_fail_percentage 34052 1727204451.21883: checking to see if all hosts have failed and the running result is not ok 34052 1727204451.21884: done checking to see if all hosts have failed 34052 1727204451.21885: getting the remaining hosts for this loop 34052 1727204451.21887: done getting the remaining hosts for this loop 34052 1727204451.21889: getting the next task for host managed-node1 34052 1727204451.21893: done getting next task for host managed-node1 34052 1727204451.21896: ^ task is: TASK: Ensure state in ["present", "absent"] 34052 1727204451.21899: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204451.21902: getting variables 34052 1727204451.21903: in VariableManager get_vars() 34052 1727204451.21925: Calling all_inventory to load vars for managed-node1 34052 1727204451.21928: Calling groups_inventory to load vars for managed-node1 34052 1727204451.21930: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.21937: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.21939: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.21943: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.23590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.25486: done with get_vars() 34052 1727204451.25517: done getting variables 34052 1727204451.25556: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.142) 0:00:37.573 ***** 34052 1727204451.25583: entering _queue_task() for managed-node1/fail 34052 1727204451.25884: worker is 1 (out of 1 available) 34052 1727204451.25900: exiting _queue_task() for managed-node1/fail 34052 1727204451.25914: done queuing things up, now waiting for results queue to drain 34052 1727204451.25915: waiting for pending results... 
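The include that was just queued lives at tests_ipv6.yml:104 and pulls in manage_test_interface.yml. A hedged sketch of what that play-level task plausibly looks like; the relative path and any variables passed alongside the include are assumptions, since the trace only records the task name and the resolved file path.

    # sketch of tests_ipv6.yml line 104 (assumed form)
    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml
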
34052 1727204451.26117: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 34052 1727204451.26199: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005cc 34052 1727204451.26213: variable 'ansible_search_path' from source: unknown 34052 1727204451.26216: variable 'ansible_search_path' from source: unknown 34052 1727204451.26254: calling self._execute() 34052 1727204451.26340: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.26345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.26356: variable 'omit' from source: magic vars 34052 1727204451.26660: variable 'ansible_distribution_major_version' from source: facts 34052 1727204451.26672: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204451.26775: variable 'state' from source: include params 34052 1727204451.26781: Evaluated conditional (state not in ["present", "absent"]): False 34052 1727204451.26784: when evaluation is False, skipping this task 34052 1727204451.26788: _execute() done 34052 1727204451.26791: dumping result to json 34052 1727204451.26795: done dumping result, returning 34052 1727204451.26803: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [127b8e07-fff9-66a4-e2a3-0000000005cc] 34052 1727204451.26807: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005cc 34052 1727204451.26906: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005cc 34052 1727204451.26914: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 34052 1727204451.26967: no more pending results, returning what we have 34052 1727204451.26971: results queue empty 34052 1727204451.26972: checking for any_errors_fatal 34052 1727204451.26974: done checking for any_errors_fatal 34052 1727204451.26975: checking for max_fail_percentage 34052 1727204451.26976: done checking for max_fail_percentage 34052 1727204451.26977: checking to see if all hosts have failed and the running result is not ok 34052 1727204451.26978: done checking to see if all hosts have failed 34052 1727204451.26979: getting the remaining hosts for this loop 34052 1727204451.26981: done getting the remaining hosts for this loop 34052 1727204451.26985: getting the next task for host managed-node1 34052 1727204451.26995: done getting next task for host managed-node1 34052 1727204451.26997: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 34052 1727204451.27004: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204451.27009: getting variables 34052 1727204451.27011: in VariableManager get_vars() 34052 1727204451.27057: Calling all_inventory to load vars for managed-node1 34052 1727204451.27060: Calling groups_inventory to load vars for managed-node1 34052 1727204451.27062: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.27081: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.27086: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.27090: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.33835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.36028: done with get_vars() 34052 1727204451.36072: done getting variables 34052 1727204451.36130: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.105) 0:00:37.678 ***** 34052 1727204451.36160: entering _queue_task() for managed-node1/fail 34052 1727204451.36568: worker is 1 (out of 1 available) 34052 1727204451.36583: exiting _queue_task() for managed-node1/fail 34052 1727204451.36597: done queuing things up, now waiting for results queue to drain 34052 1727204451.36600: waiting for pending results... 34052 1727204451.36890: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 34052 1727204451.36983: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005cd 34052 1727204451.37007: variable 'ansible_search_path' from source: unknown 34052 1727204451.37011: variable 'ansible_search_path' from source: unknown 34052 1727204451.37053: calling self._execute() 34052 1727204451.37174: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.37182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.37273: variable 'omit' from source: magic vars 34052 1727204451.37673: variable 'ansible_distribution_major_version' from source: facts 34052 1727204451.37692: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204451.37870: variable 'type' from source: play vars 34052 1727204451.37878: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 34052 1727204451.37881: when evaluation is False, skipping this task 34052 1727204451.37885: _execute() done 34052 1727204451.37888: dumping result to json 34052 1727204451.37890: done dumping result, returning 34052 1727204451.37898: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [127b8e07-fff9-66a4-e2a3-0000000005cd] 34052 1727204451.37904: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005cd 34052 1727204451.38017: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005cd 34052 1727204451.38019: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 34052 1727204451.38073: no more pending 
results, returning what we have 34052 1727204451.38076: results queue empty 34052 1727204451.38077: checking for any_errors_fatal 34052 1727204451.38088: done checking for any_errors_fatal 34052 1727204451.38089: checking for max_fail_percentage 34052 1727204451.38091: done checking for max_fail_percentage 34052 1727204451.38092: checking to see if all hosts have failed and the running result is not ok 34052 1727204451.38092: done checking to see if all hosts have failed 34052 1727204451.38093: getting the remaining hosts for this loop 34052 1727204451.38095: done getting the remaining hosts for this loop 34052 1727204451.38100: getting the next task for host managed-node1 34052 1727204451.38106: done getting next task for host managed-node1 34052 1727204451.38109: ^ task is: TASK: Include the task 'show_interfaces.yml' 34052 1727204451.38112: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204451.38118: getting variables 34052 1727204451.38119: in VariableManager get_vars() 34052 1727204451.38169: Calling all_inventory to load vars for managed-node1 34052 1727204451.38172: Calling groups_inventory to load vars for managed-node1 34052 1727204451.38175: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.38187: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.38191: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.38194: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.40295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.42722: done with get_vars() 34052 1727204451.42768: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.067) 0:00:37.746 ***** 34052 1727204451.42891: entering _queue_task() for managed-node1/include_tasks 34052 1727204451.43309: worker is 1 (out of 1 available) 34052 1727204451.43439: exiting _queue_task() for managed-node1/include_tasks 34052 1727204451.43451: done queuing things up, now waiting for results queue to drain 34052 1727204451.43453: waiting for pending results... 
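Both guard tasks in manage_test_interface.yml were skipped because their fail conditions evaluated to False. A reconstruction of those guards, assuming each is a plain fail with a when clause; the conditions are copied verbatim from the false_condition fields above, while the msg texts are placeholders.

    # sketch of manage_test_interface.yml lines 3 and 8 (assumed form)
    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be present or absent"   # placeholder message
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be dummy, tap or veth"   # placeholder message
      when: type not in ["dummy", "tap", "veth"]
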
34052 1727204451.43783: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 34052 1727204451.43973: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005ce 34052 1727204451.43978: variable 'ansible_search_path' from source: unknown 34052 1727204451.43982: variable 'ansible_search_path' from source: unknown 34052 1727204451.43986: calling self._execute() 34052 1727204451.44039: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.44045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.44057: variable 'omit' from source: magic vars 34052 1727204451.44547: variable 'ansible_distribution_major_version' from source: facts 34052 1727204451.44564: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204451.44572: _execute() done 34052 1727204451.44576: dumping result to json 34052 1727204451.44578: done dumping result, returning 34052 1727204451.44586: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-66a4-e2a3-0000000005ce] 34052 1727204451.44592: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005ce 34052 1727204451.44706: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005ce 34052 1727204451.44710: WORKER PROCESS EXITING 34052 1727204451.44746: no more pending results, returning what we have 34052 1727204451.44753: in VariableManager get_vars() 34052 1727204451.45010: Calling all_inventory to load vars for managed-node1 34052 1727204451.45013: Calling groups_inventory to load vars for managed-node1 34052 1727204451.45015: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.45030: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.45033: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.45036: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.47005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.49417: done with get_vars() 34052 1727204451.49465: variable 'ansible_search_path' from source: unknown 34052 1727204451.49468: variable 'ansible_search_path' from source: unknown 34052 1727204451.49514: we have included files to process 34052 1727204451.49515: generating all_blocks data 34052 1727204451.49517: done generating all_blocks data 34052 1727204451.49523: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204451.49524: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204451.49537: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34052 1727204451.49674: in VariableManager get_vars() 34052 1727204451.49703: done with get_vars() 34052 1727204451.49839: done processing included file 34052 1727204451.49841: iterating over new_blocks loaded from include file 34052 1727204451.49843: in VariableManager get_vars() 34052 1727204451.49873: done with get_vars() 34052 1727204451.49875: filtering new block on tags 34052 1727204451.49896: done filtering new block on tags 34052 1727204451.49898: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 34052 1727204451.49904: extending task lists for all hosts with included blocks 34052 1727204451.50461: done extending task lists 34052 1727204451.50462: done processing included files 34052 1727204451.50463: results queue empty 34052 1727204451.50464: checking for any_errors_fatal 34052 1727204451.50470: done checking for any_errors_fatal 34052 1727204451.50471: checking for max_fail_percentage 34052 1727204451.50472: done checking for max_fail_percentage 34052 1727204451.50473: checking to see if all hosts have failed and the running result is not ok 34052 1727204451.50474: done checking to see if all hosts have failed 34052 1727204451.50475: getting the remaining hosts for this loop 34052 1727204451.50476: done getting the remaining hosts for this loop 34052 1727204451.50478: getting the next task for host managed-node1 34052 1727204451.50483: done getting next task for host managed-node1 34052 1727204451.50485: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34052 1727204451.50488: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204451.50492: getting variables 34052 1727204451.50493: in VariableManager get_vars() 34052 1727204451.50516: Calling all_inventory to load vars for managed-node1 34052 1727204451.50518: Calling groups_inventory to load vars for managed-node1 34052 1727204451.50521: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.50530: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.50533: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.50536: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.53270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.55469: done with get_vars() 34052 1727204451.55508: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.127) 0:00:37.873 ***** 34052 1727204451.55604: entering _queue_task() for managed-node1/include_tasks 34052 1727204451.56032: worker is 1 (out of 1 available) 34052 1727204451.56046: exiting _queue_task() for managed-node1/include_tasks 34052 1727204451.56174: done queuing things up, now waiting for results queue to drain 34052 1727204451.56177: waiting for pending results... 
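The trace shows two levels of nesting: manage_test_interface.yml includes show_interfaces.yml, which in turn includes get_current_interfaces.yml. A sketch of that chain, assuming each level is a bare include_tasks with no extra arguments (file and task names are taken from the log).

    # manage_test_interface.yml (assumed form)
    - name: Include the task 'show_interfaces.yml'
      include_tasks: show_interfaces.yml

    # show_interfaces.yml line 3 (assumed form)
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml
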
34052 1727204451.56785: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 34052 1727204451.56791: in run() - task 127b8e07-fff9-66a4-e2a3-0000000006e4 34052 1727204451.56795: variable 'ansible_search_path' from source: unknown 34052 1727204451.56798: variable 'ansible_search_path' from source: unknown 34052 1727204451.56801: calling self._execute() 34052 1727204451.56804: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.56807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.56812: variable 'omit' from source: magic vars 34052 1727204451.57203: variable 'ansible_distribution_major_version' from source: facts 34052 1727204451.57214: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204451.57222: _execute() done 34052 1727204451.57226: dumping result to json 34052 1727204451.57228: done dumping result, returning 34052 1727204451.57238: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-66a4-e2a3-0000000006e4] 34052 1727204451.57243: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000006e4 34052 1727204451.57352: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000006e4 34052 1727204451.57355: WORKER PROCESS EXITING 34052 1727204451.57396: no more pending results, returning what we have 34052 1727204451.57403: in VariableManager get_vars() 34052 1727204451.57462: Calling all_inventory to load vars for managed-node1 34052 1727204451.57468: Calling groups_inventory to load vars for managed-node1 34052 1727204451.57470: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.57489: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.57493: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.57497: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.59886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.62349: done with get_vars() 34052 1727204451.62504: variable 'ansible_search_path' from source: unknown 34052 1727204451.62506: variable 'ansible_search_path' from source: unknown 34052 1727204451.62691: we have included files to process 34052 1727204451.62692: generating all_blocks data 34052 1727204451.62694: done generating all_blocks data 34052 1727204451.62695: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204451.62697: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204451.62699: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34052 1727204451.63353: done processing included file 34052 1727204451.63355: iterating over new_blocks loaded from include file 34052 1727204451.63357: in VariableManager get_vars() 34052 1727204451.63454: done with get_vars() 34052 1727204451.63456: filtering new block on tags 34052 1727204451.63482: done filtering new block on tags 34052 1727204451.63485: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node1 34052 1727204451.63491: extending task lists for all hosts with included blocks 34052 1727204451.63924: done extending task lists 34052 1727204451.63928: done processing included files 34052 1727204451.63929: results queue empty 34052 1727204451.63929: checking for any_errors_fatal 34052 1727204451.63933: done checking for any_errors_fatal 34052 1727204451.63933: checking for max_fail_percentage 34052 1727204451.63935: done checking for max_fail_percentage 34052 1727204451.63935: checking to see if all hosts have failed and the running result is not ok 34052 1727204451.63936: done checking to see if all hosts have failed 34052 1727204451.63937: getting the remaining hosts for this loop 34052 1727204451.63938: done getting the remaining hosts for this loop 34052 1727204451.63941: getting the next task for host managed-node1 34052 1727204451.63946: done getting next task for host managed-node1 34052 1727204451.63949: ^ task is: TASK: Gather current interface info 34052 1727204451.63953: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204451.63955: getting variables 34052 1727204451.63956: in VariableManager get_vars() 34052 1727204451.64089: Calling all_inventory to load vars for managed-node1 34052 1727204451.64093: Calling groups_inventory to load vars for managed-node1 34052 1727204451.64096: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204451.64103: Calling all_plugins_play to load vars for managed-node1 34052 1727204451.64106: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204451.64109: Calling groups_plugins_play to load vars for managed-node1 34052 1727204451.65991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204451.68463: done with get_vars() 34052 1727204451.68498: done getting variables 34052 1727204451.68555: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.129) 0:00:38.003 ***** 34052 1727204451.68602: entering _queue_task() for managed-node1/command 34052 1727204451.69170: worker is 1 (out of 1 available) 34052 1727204451.69185: exiting _queue_task() for managed-node1/command 34052 1727204451.69198: done queuing things up, now waiting for results queue to drain 34052 1727204451.69200: waiting for pending results... 
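The task queued here, 'Gather current interface info' at get_current_interfaces.yml:3, runs the command module. A sketch of it, assuming a register named _current_interfaces (that name is inferred from the later 'Set current_interfaces' step, which reads a _current_interfaces variable); the ls -1 command and the /sys/class/net chdir are confirmed by the module_args echoed further down in the trace.

    # sketch of get_current_interfaces.yml line 3 (assumed form)
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces
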
34052 1727204451.69797: running TaskExecutor() for managed-node1/TASK: Gather current interface info 34052 1727204451.69952: in run() - task 127b8e07-fff9-66a4-e2a3-00000000071b 34052 1727204451.69957: variable 'ansible_search_path' from source: unknown 34052 1727204451.69960: variable 'ansible_search_path' from source: unknown 34052 1727204451.69963: calling self._execute() 34052 1727204451.69968: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.69972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.69975: variable 'omit' from source: magic vars 34052 1727204451.70284: variable 'ansible_distribution_major_version' from source: facts 34052 1727204451.70294: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204451.70302: variable 'omit' from source: magic vars 34052 1727204451.70378: variable 'omit' from source: magic vars 34052 1727204451.70420: variable 'omit' from source: magic vars 34052 1727204451.70478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204451.70520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204451.70552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204451.70575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204451.70590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204451.70623: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204451.70627: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.70634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.70743: Set connection var ansible_connection to ssh 34052 1727204451.70876: Set connection var ansible_timeout to 10 34052 1727204451.70881: Set connection var ansible_pipelining to False 34052 1727204451.70884: Set connection var ansible_shell_type to sh 34052 1727204451.70886: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204451.70889: Set connection var ansible_shell_executable to /bin/sh 34052 1727204451.70893: variable 'ansible_shell_executable' from source: unknown 34052 1727204451.70895: variable 'ansible_connection' from source: unknown 34052 1727204451.70899: variable 'ansible_module_compression' from source: unknown 34052 1727204451.70902: variable 'ansible_shell_type' from source: unknown 34052 1727204451.70906: variable 'ansible_shell_executable' from source: unknown 34052 1727204451.70909: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204451.70911: variable 'ansible_pipelining' from source: unknown 34052 1727204451.70914: variable 'ansible_timeout' from source: unknown 34052 1727204451.70917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204451.71172: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204451.71176: variable 'omit' from source: magic vars 34052 
1727204451.71179: starting attempt loop 34052 1727204451.71181: running the handler 34052 1727204451.71184: _low_level_execute_command(): starting 34052 1727204451.71186: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204451.71991: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204451.72031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204451.72104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204451.72137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204451.72306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204451.74220: stdout chunk (state=3): >>>/root <<< 34052 1727204451.74224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204451.74269: stderr chunk (state=3): >>><<< 34052 1727204451.74273: stdout chunk (state=3): >>><<< 34052 1727204451.74334: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204451.74349: _low_level_execute_command(): starting 34052 1727204451.74358: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490 `" && echo ansible-tmp-1727204451.7433379-36475-178001654708490="` echo 
/root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490 `" ) && sleep 0' 34052 1727204451.75755: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204451.75759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204451.75847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204451.75860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204451.75937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204451.75941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204451.76100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204451.76339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204451.78438: stdout chunk (state=3): >>>ansible-tmp-1727204451.7433379-36475-178001654708490=/root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490 <<< 34052 1727204451.78592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204451.78678: stderr chunk (state=3): >>><<< 34052 1727204451.78871: stdout chunk (state=3): >>><<< 34052 1727204451.78875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204451.7433379-36475-178001654708490=/root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204451.78878: variable 'ansible_module_compression' from source: unknown 34052 1727204451.78880: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204451.78882: variable 'ansible_facts' from source: unknown 34052 1727204451.78945: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/AnsiballZ_command.py 34052 1727204451.79129: Sending initial data 34052 1727204451.79140: Sent initial data (156 bytes) 34052 1727204451.79992: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204451.80115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204451.80197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204451.81878: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34052 1727204451.81922: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204451.81961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204451.82051: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpama4nzyd /root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/AnsiballZ_command.py <<< 34052 1727204451.82055: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/AnsiballZ_command.py" <<< 34052 1727204451.82098: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpama4nzyd" to remote "/root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/AnsiballZ_command.py" <<< 34052 1727204451.82997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204451.83176: stderr chunk (state=3): >>><<< 34052 1727204451.83181: stdout chunk (state=3): >>><<< 34052 1727204451.83183: done transferring module to remote 34052 1727204451.83186: _low_level_execute_command(): starting 34052 1727204451.83188: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/ /root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/AnsiballZ_command.py && sleep 0' 34052 1727204451.84306: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204451.84312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204451.84416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204451.84425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204451.84571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204451.84617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204451.86573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204451.86691: stderr chunk (state=3): >>><<< 34052 1727204451.86694: stdout chunk (state=3): >>><<< 34052 1727204451.86709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204451.86807: _low_level_execute_command(): starting 34052 1727204451.86811: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/AnsiballZ_command.py && sleep 0' 34052 1727204451.87446: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204451.87467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204451.87592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204451.87617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204451.87715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204452.05132: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:52.046730", "end": "2024-09-24 15:00:52.050332", "delta": "0:00:00.003602", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204452.06810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204452.06875: stderr chunk (state=3): >>><<< 34052 1727204452.06880: stdout chunk (state=3): >>><<< 34052 1727204452.06898: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:52.046730", "end": "2024-09-24 15:00:52.050332", "delta": "0:00:00.003602", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
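The module just returned rc=0 with four interface names on stdout. Assuming the task registered its result as _current_interfaces (see the earlier note), the registered variable would carry roughly this shape; stdout and the derived stdout_lines follow from the output above, and the remaining keys are the usual command-module result fields.

    # approximate shape of the registered result (illustrative only)
    _current_interfaces:
      changed: true
      rc: 0
      stdout: "bonding_masters\neth0\nlo\nveth0"
      stdout_lines:
        - bonding_masters
        - eth0
        - lo
        - veth0
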
34052 1727204452.06930: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204452.06939: _low_level_execute_command(): starting 34052 1727204452.06944: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204451.7433379-36475-178001654708490/ > /dev/null 2>&1 && sleep 0' 34052 1727204452.07441: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204452.07445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.07448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204452.07450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.07508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204452.07520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204452.07522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204452.07569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204452.09523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204452.09591: stderr chunk (state=3): >>><<< 34052 1727204452.09595: stdout chunk (state=3): >>><<< 34052 1727204452.09610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204452.09617: handler run complete 34052 1727204452.09639: Evaluated conditional (False): False 34052 1727204452.09649: attempt loop complete, returning result 34052 1727204452.09652: _execute() done 34052 1727204452.09654: dumping result to json 34052 1727204452.09666: done dumping result, returning 34052 1727204452.09672: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [127b8e07-fff9-66a4-e2a3-00000000071b] 34052 1727204452.09676: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000071b 34052 1727204452.09789: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000071b 34052 1727204452.09792: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003602", "end": "2024-09-24 15:00:52.050332", "rc": 0, "start": "2024-09-24 15:00:52.046730" } STDOUT: bonding_masters eth0 lo veth0 34052 1727204452.09881: no more pending results, returning what we have 34052 1727204452.09884: results queue empty 34052 1727204452.09885: checking for any_errors_fatal 34052 1727204452.09887: done checking for any_errors_fatal 34052 1727204452.09887: checking for max_fail_percentage 34052 1727204452.09889: done checking for max_fail_percentage 34052 1727204452.09890: checking to see if all hosts have failed and the running result is not ok 34052 1727204452.09891: done checking to see if all hosts have failed 34052 1727204452.09892: getting the remaining hosts for this loop 34052 1727204452.09893: done getting the remaining hosts for this loop 34052 1727204452.09898: getting the next task for host managed-node1 34052 1727204452.09911: done getting next task for host managed-node1 34052 1727204452.09915: ^ task is: TASK: Set current_interfaces 34052 1727204452.09920: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204452.09925: getting variables 34052 1727204452.09929: in VariableManager get_vars() 34052 1727204452.09972: Calling all_inventory to load vars for managed-node1 34052 1727204452.09975: Calling groups_inventory to load vars for managed-node1 34052 1727204452.09977: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204452.09989: Calling all_plugins_play to load vars for managed-node1 34052 1727204452.09992: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204452.09995: Calling groups_plugins_play to load vars for managed-node1 34052 1727204452.11042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204452.12873: done with get_vars() 34052 1727204452.12905: done getting variables 34052 1727204452.12957: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.443) 0:00:38.447 ***** 34052 1727204452.12987: entering _queue_task() for managed-node1/set_fact 34052 1727204452.13288: worker is 1 (out of 1 available) 34052 1727204452.13304: exiting _queue_task() for managed-node1/set_fact 34052 1727204452.13318: done queuing things up, now waiting for results queue to drain 34052 1727204452.13319: waiting for pending results... 
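
The two tasks traced around this point come from the helper file tasks/get_current_interfaces.yml referenced in the log: a command task that lists /sys/class/net and a set_fact task that turns its output into the current_interfaces list. A minimal sketch of what such tasks could look like follows; the ls -1 invocation, the chdir, and the _current_interfaces variable name are taken from the log above, while the stdout_lines conversion is an assumption and this is not the collection's actual source.

- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

With the STDOUT shown above, this would yield current_interfaces == ['bonding_masters', 'eth0', 'lo', 'veth0'], matching the fact reported in the next task result.
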
34052 1727204452.13523: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 34052 1727204452.13606: in run() - task 127b8e07-fff9-66a4-e2a3-00000000071c 34052 1727204452.13625: variable 'ansible_search_path' from source: unknown 34052 1727204452.13629: variable 'ansible_search_path' from source: unknown 34052 1727204452.13665: calling self._execute() 34052 1727204452.13752: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.13756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.13771: variable 'omit' from source: magic vars 34052 1727204452.14096: variable 'ansible_distribution_major_version' from source: facts 34052 1727204452.14111: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204452.14114: variable 'omit' from source: magic vars 34052 1727204452.14161: variable 'omit' from source: magic vars 34052 1727204452.14248: variable '_current_interfaces' from source: set_fact 34052 1727204452.14300: variable 'omit' from source: magic vars 34052 1727204452.14338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204452.14371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204452.14391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204452.14405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204452.14421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204452.14447: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204452.14450: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.14453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.14525: Set connection var ansible_connection to ssh 34052 1727204452.14537: Set connection var ansible_timeout to 10 34052 1727204452.14542: Set connection var ansible_pipelining to False 34052 1727204452.14545: Set connection var ansible_shell_type to sh 34052 1727204452.14556: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204452.14562: Set connection var ansible_shell_executable to /bin/sh 34052 1727204452.14584: variable 'ansible_shell_executable' from source: unknown 34052 1727204452.14588: variable 'ansible_connection' from source: unknown 34052 1727204452.14591: variable 'ansible_module_compression' from source: unknown 34052 1727204452.14594: variable 'ansible_shell_type' from source: unknown 34052 1727204452.14596: variable 'ansible_shell_executable' from source: unknown 34052 1727204452.14599: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.14601: variable 'ansible_pipelining' from source: unknown 34052 1727204452.14604: variable 'ansible_timeout' from source: unknown 34052 1727204452.14606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.14722: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34052 1727204452.14733: variable 'omit' from source: magic vars 34052 1727204452.14742: starting attempt loop 34052 1727204452.14745: running the handler 34052 1727204452.14756: handler run complete 34052 1727204452.14767: attempt loop complete, returning result 34052 1727204452.14770: _execute() done 34052 1727204452.14775: dumping result to json 34052 1727204452.14777: done dumping result, returning 34052 1727204452.14780: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [127b8e07-fff9-66a4-e2a3-00000000071c] 34052 1727204452.14785: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000071c 34052 1727204452.14874: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000071c 34052 1727204452.14876: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 34052 1727204452.14936: no more pending results, returning what we have 34052 1727204452.14939: results queue empty 34052 1727204452.14940: checking for any_errors_fatal 34052 1727204452.14953: done checking for any_errors_fatal 34052 1727204452.14954: checking for max_fail_percentage 34052 1727204452.14955: done checking for max_fail_percentage 34052 1727204452.14956: checking to see if all hosts have failed and the running result is not ok 34052 1727204452.14957: done checking to see if all hosts have failed 34052 1727204452.14958: getting the remaining hosts for this loop 34052 1727204452.14959: done getting the remaining hosts for this loop 34052 1727204452.14964: getting the next task for host managed-node1 34052 1727204452.14975: done getting next task for host managed-node1 34052 1727204452.14977: ^ task is: TASK: Show current_interfaces 34052 1727204452.14982: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204452.14987: getting variables 34052 1727204452.14989: in VariableManager get_vars() 34052 1727204452.15032: Calling all_inventory to load vars for managed-node1 34052 1727204452.15035: Calling groups_inventory to load vars for managed-node1 34052 1727204452.15037: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204452.15048: Calling all_plugins_play to load vars for managed-node1 34052 1727204452.15051: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204452.15054: Calling groups_plugins_play to load vars for managed-node1 34052 1727204452.16200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204452.17610: done with get_vars() 34052 1727204452.17649: done getting variables 34052 1727204452.17719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.047) 0:00:38.494 ***** 34052 1727204452.17755: entering _queue_task() for managed-node1/debug 34052 1727204452.18147: worker is 1 (out of 1 available) 34052 1727204452.18161: exiting _queue_task() for managed-node1/debug 34052 1727204452.18176: done queuing things up, now waiting for results queue to drain 34052 1727204452.18178: waiting for pending results... 34052 1727204452.18589: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 34052 1727204452.18625: in run() - task 127b8e07-fff9-66a4-e2a3-0000000006e5 34052 1727204452.18650: variable 'ansible_search_path' from source: unknown 34052 1727204452.18659: variable 'ansible_search_path' from source: unknown 34052 1727204452.18711: calling self._execute() 34052 1727204452.18830: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.18844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.18860: variable 'omit' from source: magic vars 34052 1727204452.19292: variable 'ansible_distribution_major_version' from source: facts 34052 1727204452.19312: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204452.19325: variable 'omit' from source: magic vars 34052 1727204452.19386: variable 'omit' from source: magic vars 34052 1727204452.19509: variable 'current_interfaces' from source: set_fact 34052 1727204452.19544: variable 'omit' from source: magic vars 34052 1727204452.19599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204452.19647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204452.19770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204452.19773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204452.19778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204452.19780: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204452.19782: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.19784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.19883: Set connection var ansible_connection to ssh 34052 1727204452.19897: Set connection var ansible_timeout to 10 34052 1727204452.19910: Set connection var ansible_pipelining to False 34052 1727204452.19917: Set connection var ansible_shell_type to sh 34052 1727204452.19932: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204452.19945: Set connection var ansible_shell_executable to /bin/sh 34052 1727204452.19979: variable 'ansible_shell_executable' from source: unknown 34052 1727204452.19992: variable 'ansible_connection' from source: unknown 34052 1727204452.20000: variable 'ansible_module_compression' from source: unknown 34052 1727204452.20007: variable 'ansible_shell_type' from source: unknown 34052 1727204452.20015: variable 'ansible_shell_executable' from source: unknown 34052 1727204452.20022: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.20030: variable 'ansible_pipelining' from source: unknown 34052 1727204452.20037: variable 'ansible_timeout' from source: unknown 34052 1727204452.20046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.20209: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204452.20228: variable 'omit' from source: magic vars 34052 1727204452.20316: starting attempt loop 34052 1727204452.20319: running the handler 34052 1727204452.20322: handler run complete 34052 1727204452.20324: attempt loop complete, returning result 34052 1727204452.20329: _execute() done 34052 1727204452.20337: dumping result to json 34052 1727204452.20344: done dumping result, returning 34052 1727204452.20356: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [127b8e07-fff9-66a4-e2a3-0000000006e5] 34052 1727204452.20364: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000006e5 34052 1727204452.20672: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000006e5 34052 1727204452.20676: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 34052 1727204452.20728: no more pending results, returning what we have 34052 1727204452.20732: results queue empty 34052 1727204452.20733: checking for any_errors_fatal 34052 1727204452.20740: done checking for any_errors_fatal 34052 1727204452.20741: checking for max_fail_percentage 34052 1727204452.20743: done checking for max_fail_percentage 34052 1727204452.20744: checking to see if all hosts have failed and the running result is not ok 34052 1727204452.20745: done checking to see if all hosts have failed 34052 1727204452.20746: getting the remaining hosts for this loop 34052 1727204452.20748: done getting the remaining hosts for this loop 34052 1727204452.20753: getting the next task for host managed-node1 34052 1727204452.20762: done getting next task for host managed-node1 34052 1727204452.20767: ^ task is: TASK: Install iproute 34052 1727204452.20771: ^ state is: HOST STATE: block=3, task=15, 
rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204452.20776: getting variables 34052 1727204452.20778: in VariableManager get_vars() 34052 1727204452.20826: Calling all_inventory to load vars for managed-node1 34052 1727204452.20829: Calling groups_inventory to load vars for managed-node1 34052 1727204452.20831: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204452.20845: Calling all_plugins_play to load vars for managed-node1 34052 1727204452.20848: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204452.20852: Calling groups_plugins_play to load vars for managed-node1 34052 1727204452.22128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204452.23739: done with get_vars() 34052 1727204452.23775: done getting variables 34052 1727204452.23840: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.061) 0:00:38.555 ***** 34052 1727204452.23877: entering _queue_task() for managed-node1/package 34052 1727204452.24277: worker is 1 (out of 1 available) 34052 1727204452.24293: exiting _queue_task() for managed-node1/package 34052 1727204452.24308: done queuing things up, now waiting for results queue to drain 34052 1727204452.24310: waiting for pending results... 
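
The "Install iproute" task queued here invokes the package action (backed by ansible.legacy.dnf on this host) with name: iproute and state: present, as the module arguments later in the log show. The log also evaluates the conditional "__install_status is success" and reports "attempts": 1, which suggests a registered result with an until/retries loop. A hedged sketch under those assumptions; the retries and delay values are illustrative, not taken from the log:

- name: Install iproute
  package:
    name: iproute
    state: present
  register: __install_status
  until: __install_status is success
  retries: 6
  delay: 10
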
34052 1727204452.24570: running TaskExecutor() for managed-node1/TASK: Install iproute 34052 1727204452.24650: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005cf 34052 1727204452.24661: variable 'ansible_search_path' from source: unknown 34052 1727204452.24669: variable 'ansible_search_path' from source: unknown 34052 1727204452.24702: calling self._execute() 34052 1727204452.24790: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.24794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.24806: variable 'omit' from source: magic vars 34052 1727204452.25130: variable 'ansible_distribution_major_version' from source: facts 34052 1727204452.25143: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204452.25149: variable 'omit' from source: magic vars 34052 1727204452.25183: variable 'omit' from source: magic vars 34052 1727204452.25345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34052 1727204452.27475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34052 1727204452.27511: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34052 1727204452.27562: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34052 1727204452.27642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34052 1727204452.27681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34052 1727204452.27809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34052 1727204452.27832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34052 1727204452.27851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34052 1727204452.27882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34052 1727204452.27893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34052 1727204452.27993: variable '__network_is_ostree' from source: set_fact 34052 1727204452.27996: variable 'omit' from source: magic vars 34052 1727204452.28031: variable 'omit' from source: magic vars 34052 1727204452.28050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204452.28075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204452.28091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204452.28105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 34052 1727204452.28113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204452.28143: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204452.28146: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.28148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.28219: Set connection var ansible_connection to ssh 34052 1727204452.28228: Set connection var ansible_timeout to 10 34052 1727204452.28231: Set connection var ansible_pipelining to False 34052 1727204452.28234: Set connection var ansible_shell_type to sh 34052 1727204452.28245: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204452.28252: Set connection var ansible_shell_executable to /bin/sh 34052 1727204452.28275: variable 'ansible_shell_executable' from source: unknown 34052 1727204452.28279: variable 'ansible_connection' from source: unknown 34052 1727204452.28282: variable 'ansible_module_compression' from source: unknown 34052 1727204452.28284: variable 'ansible_shell_type' from source: unknown 34052 1727204452.28287: variable 'ansible_shell_executable' from source: unknown 34052 1727204452.28289: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204452.28294: variable 'ansible_pipelining' from source: unknown 34052 1727204452.28296: variable 'ansible_timeout' from source: unknown 34052 1727204452.28301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204452.28383: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204452.28393: variable 'omit' from source: magic vars 34052 1727204452.28398: starting attempt loop 34052 1727204452.28401: running the handler 34052 1727204452.28407: variable 'ansible_facts' from source: unknown 34052 1727204452.28411: variable 'ansible_facts' from source: unknown 34052 1727204452.28441: _low_level_execute_command(): starting 34052 1727204452.28446: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204452.28999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204452.29006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204452.29010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.29069: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204452.29074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204452.29076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204452.29148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204452.30968: stdout chunk (state=3): >>>/root <<< 34052 1727204452.31076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204452.31141: stderr chunk (state=3): >>><<< 34052 1727204452.31144: stdout chunk (state=3): >>><<< 34052 1727204452.31158: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204452.31185: _low_level_execute_command(): starting 34052 1727204452.31191: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957 `" && echo ansible-tmp-1727204452.3117344-36505-793705116957="` echo /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957 `" ) && sleep 0' 34052 1727204452.31672: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204452.31677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204452.31705: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.31709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204452.31712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204452.31714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.31775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204452.31779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204452.31783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204452.31839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204452.33891: stdout chunk (state=3): >>>ansible-tmp-1727204452.3117344-36505-793705116957=/root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957 <<< 34052 1727204452.34015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204452.34074: stderr chunk (state=3): >>><<< 34052 1727204452.34078: stdout chunk (state=3): >>><<< 34052 1727204452.34093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204452.3117344-36505-793705116957=/root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204452.34127: variable 'ansible_module_compression' from source: unknown 34052 1727204452.34182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 34052 1727204452.34220: variable 'ansible_facts' from source: unknown 34052 1727204452.34308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/AnsiballZ_dnf.py 34052 1727204452.34433: Sending initial data 34052 1727204452.34436: Sent initial data (149 bytes) 34052 1727204452.34954: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204452.34959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.34961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 34052 1727204452.34963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204452.34969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.35023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204452.35029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204452.35035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204452.35091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204452.36817: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204452.36860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204452.36908: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpxb4gjs9u /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/AnsiballZ_dnf.py <<< 34052 1727204452.36913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/AnsiballZ_dnf.py" <<< 34052 1727204452.36963: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpxb4gjs9u" to remote "/root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/AnsiballZ_dnf.py" <<< 34052 1727204452.36966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/AnsiballZ_dnf.py" <<< 34052 1727204452.37772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204452.37851: stderr chunk (state=3): >>><<< 34052 1727204452.37855: stdout chunk (state=3): >>><<< 34052 1727204452.37877: done transferring module to remote 34052 1727204452.37888: _low_level_execute_command(): starting 34052 1727204452.37897: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/ /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/AnsiballZ_dnf.py && sleep 0' 34052 1727204452.38417: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204452.38422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34052 1727204452.38425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204452.38428: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.38473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204452.38494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204452.38545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204452.40548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204452.40579: stderr chunk (state=3): >>><<< 34052 1727204452.40582: stdout chunk (state=3): >>><<< 34052 1727204452.40598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204452.40600: _low_level_execute_command(): starting 34052 1727204452.40606: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/AnsiballZ_dnf.py && sleep 0' 34052 1727204452.41108: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204452.41112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.41114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204452.41117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204452.41182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204452.41188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204452.41190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204452.41248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204453.55578: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 34052 1727204453.60473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204453.60481: stderr chunk (state=3): >>><<< 34052 1727204453.60487: stdout chunk (state=3): >>><<< 34052 1727204453.60514: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204453.60570: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204453.60575: _low_level_execute_command(): starting 34052 1727204453.60581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204452.3117344-36505-793705116957/ > /dev/null 2>&1 && sleep 0' 34052 1727204453.62171: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204453.62338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204453.62404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204453.62464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204453.62492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204453.62592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204453.80672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204453.80773: stdout chunk (state=3): >>><<< 34052 1727204453.80777: stderr chunk (state=3): >>><<< 34052 1727204453.80780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204453.80783: handler run complete 34052 1727204453.80923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34052 1727204453.81139: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34052 1727204453.81187: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34052 1727204453.81224: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34052 1727204453.81257: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34052 1727204453.81338: variable '__install_status' from source: set_fact 34052 1727204453.81362: Evaluated conditional (__install_status is success): True 34052 1727204453.81387: attempt loop complete, returning result 34052 1727204453.81393: _execute() done 34052 1727204453.81399: dumping result to json 34052 1727204453.81408: done dumping result, returning 34052 1727204453.81418: done running TaskExecutor() for managed-node1/TASK: Install iproute [127b8e07-fff9-66a4-e2a3-0000000005cf] 34052 1727204453.81425: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005cf ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 34052 1727204453.81638: no more pending results, returning what we have 34052 1727204453.81871: results queue empty 34052 1727204453.81873: checking for any_errors_fatal 34052 1727204453.81879: done checking for any_errors_fatal 34052 1727204453.81880: checking for max_fail_percentage 34052 1727204453.81882: done checking for max_fail_percentage 34052 1727204453.81882: checking to see if all hosts have failed and the running result is not ok 34052 1727204453.81883: done checking to see if all hosts have failed 34052 1727204453.81884: getting the remaining hosts for this loop 34052 1727204453.81886: done getting the remaining hosts for this loop 34052 1727204453.81890: getting the next task for host managed-node1 34052 1727204453.81896: done getting next task for host managed-node1 34052 1727204453.81898: ^ task is: TASK: Create veth interface {{ interface }} 34052 1727204453.81901: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204453.81905: getting variables 34052 1727204453.81907: in VariableManager get_vars() 34052 1727204453.81955: Calling all_inventory to load vars for managed-node1 34052 1727204453.81958: Calling groups_inventory to load vars for managed-node1 34052 1727204453.81961: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204453.81976: Calling all_plugins_play to load vars for managed-node1 34052 1727204453.81979: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204453.81983: Calling groups_plugins_play to load vars for managed-node1 34052 1727204453.82659: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005cf 34052 1727204453.82663: WORKER PROCESS EXITING 34052 1727204453.84509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204453.86812: done with get_vars() 34052 1727204453.86852: done getting variables 34052 1727204453.86926: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204453.87062: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:00:53 -0400 (0:00:01.632) 0:00:40.188 ***** 34052 1727204453.87106: entering _queue_task() for managed-node1/command 34052 1727204453.87507: worker is 1 (out of 1 available) 34052 1727204453.87634: exiting _queue_task() for managed-node1/command 34052 1727204453.87650: done queuing things up, now waiting for results queue to drain 34052 1727204453.87651: waiting for pending results... 
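
The "Create veth interface veth0" task that runs next is a command task looped over three ip commands; the loop items and the when condition are visible verbatim in the skipped-item results below. A plausible sketch of the task, with {{ interface }} templating inferred from the templated task name; this is a reconstruction from the log, not the collection's actual source:

- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces

In this run every item is skipped because veth0 is already present in current_interfaces, so the conditional evaluates to False.
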
34052 1727204453.87883: running TaskExecutor() for managed-node1/TASK: Create veth interface veth0 34052 1727204453.87995: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005d0 34052 1727204453.88017: variable 'ansible_search_path' from source: unknown 34052 1727204453.88021: variable 'ansible_search_path' from source: unknown 34052 1727204453.88333: variable 'interface' from source: play vars 34052 1727204453.88433: variable 'interface' from source: play vars 34052 1727204453.88514: variable 'interface' from source: play vars 34052 1727204453.88680: Loaded config def from plugin (lookup/items) 34052 1727204453.88687: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 34052 1727204453.88710: variable 'omit' from source: magic vars 34052 1727204453.88887: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204453.88898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204453.88911: variable 'omit' from source: magic vars 34052 1727204453.89211: variable 'ansible_distribution_major_version' from source: facts 34052 1727204453.89219: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204453.89456: variable 'type' from source: play vars 34052 1727204453.89460: variable 'state' from source: include params 34052 1727204453.89671: variable 'interface' from source: play vars 34052 1727204453.89675: variable 'current_interfaces' from source: set_fact 34052 1727204453.89677: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 34052 1727204453.89680: when evaluation is False, skipping this task 34052 1727204453.89682: variable 'item' from source: unknown 34052 1727204453.89684: variable 'item' from source: unknown skipping: [managed-node1] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 34052 1727204453.89848: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204453.89852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204453.89855: variable 'omit' from source: magic vars 34052 1727204453.90073: variable 'ansible_distribution_major_version' from source: facts 34052 1727204453.90077: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204453.90183: variable 'type' from source: play vars 34052 1727204453.90187: variable 'state' from source: include params 34052 1727204453.90193: variable 'interface' from source: play vars 34052 1727204453.90196: variable 'current_interfaces' from source: set_fact 34052 1727204453.90198: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 34052 1727204453.90201: when evaluation is False, skipping this task 34052 1727204453.90203: variable 'item' from source: unknown 34052 1727204453.90255: variable 'item' from source: unknown skipping: [managed-node1] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 34052 1727204453.90349: variable 
'ansible_host' from source: host vars for 'managed-node1' 34052 1727204453.90353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204453.90356: variable 'omit' from source: magic vars 34052 1727204453.90521: variable 'ansible_distribution_major_version' from source: facts 34052 1727204453.90525: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204453.90836: variable 'type' from source: play vars 34052 1727204453.90840: variable 'state' from source: include params 34052 1727204453.90843: variable 'interface' from source: play vars 34052 1727204453.90845: variable 'current_interfaces' from source: set_fact 34052 1727204453.90847: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 34052 1727204453.90850: when evaluation is False, skipping this task 34052 1727204453.90852: variable 'item' from source: unknown 34052 1727204453.90854: variable 'item' from source: unknown skipping: [managed-node1] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 34052 1727204453.91021: dumping result to json 34052 1727204453.91027: done dumping result, returning 34052 1727204453.91031: done running TaskExecutor() for managed-node1/TASK: Create veth interface veth0 [127b8e07-fff9-66a4-e2a3-0000000005d0] 34052 1727204453.91034: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d0 34052 1727204453.91082: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d0 34052 1727204453.91085: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false } MSG: All items skipped 34052 1727204453.91134: no more pending results, returning what we have 34052 1727204453.91138: results queue empty 34052 1727204453.91139: checking for any_errors_fatal 34052 1727204453.91151: done checking for any_errors_fatal 34052 1727204453.91152: checking for max_fail_percentage 34052 1727204453.91153: done checking for max_fail_percentage 34052 1727204453.91154: checking to see if all hosts have failed and the running result is not ok 34052 1727204453.91155: done checking to see if all hosts have failed 34052 1727204453.91156: getting the remaining hosts for this loop 34052 1727204453.91158: done getting the remaining hosts for this loop 34052 1727204453.91163: getting the next task for host managed-node1 34052 1727204453.91170: done getting next task for host managed-node1 34052 1727204453.91174: ^ task is: TASK: Set up veth as managed by NetworkManager 34052 1727204453.91178: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204453.91183: getting variables 34052 1727204453.91185: in VariableManager get_vars() 34052 1727204453.91233: Calling all_inventory to load vars for managed-node1 34052 1727204453.91237: Calling groups_inventory to load vars for managed-node1 34052 1727204453.91240: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204453.91254: Calling all_plugins_play to load vars for managed-node1 34052 1727204453.91258: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204453.91262: Calling groups_plugins_play to load vars for managed-node1 34052 1727204453.93450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204453.95644: done with get_vars() 34052 1727204453.95685: done getting variables 34052 1727204453.95758: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:00:53 -0400 (0:00:00.086) 0:00:40.275 ***** 34052 1727204453.95797: entering _queue_task() for managed-node1/command 34052 1727204453.96393: worker is 1 (out of 1 available) 34052 1727204453.96406: exiting _queue_task() for managed-node1/command 34052 1727204453.96418: done queuing things up, now waiting for results queue to drain 34052 1727204453.96420: waiting for pending results... 
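The three skipped loop items above (ip link add / ip link set ...) and their shared false_condition come from a single looping command task in manage_test_interface.yml; the log also shows the 'items' lookup plugin being loaded for it. A minimal sketch of what that task plausibly looks like, reconstructed only from the items and conditional visible in this log (the real file may differ):

    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces

With interface resolving to veth0 from play vars and the type/state/current_interfaces combination not matching, every item is skipped, which is exactly the per-item skipping recorded above.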
34052 1727204453.96685: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 34052 1727204453.96691: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005d1 34052 1727204453.96694: variable 'ansible_search_path' from source: unknown 34052 1727204453.96697: variable 'ansible_search_path' from source: unknown 34052 1727204453.96699: calling self._execute() 34052 1727204453.96817: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204453.96821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204453.96832: variable 'omit' from source: magic vars 34052 1727204453.97256: variable 'ansible_distribution_major_version' from source: facts 34052 1727204453.97270: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204453.97456: variable 'type' from source: play vars 34052 1727204453.97460: variable 'state' from source: include params 34052 1727204453.97468: Evaluated conditional (type == 'veth' and state == 'present'): False 34052 1727204453.97472: when evaluation is False, skipping this task 34052 1727204453.97475: _execute() done 34052 1727204453.97478: dumping result to json 34052 1727204453.97481: done dumping result, returning 34052 1727204453.97489: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [127b8e07-fff9-66a4-e2a3-0000000005d1] 34052 1727204453.97494: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d1 34052 1727204453.97602: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d1 34052 1727204453.97605: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 34052 1727204453.97675: no more pending results, returning what we have 34052 1727204453.97680: results queue empty 34052 1727204453.97681: checking for any_errors_fatal 34052 1727204453.97694: done checking for any_errors_fatal 34052 1727204453.97694: checking for max_fail_percentage 34052 1727204453.97696: done checking for max_fail_percentage 34052 1727204453.97697: checking to see if all hosts have failed and the running result is not ok 34052 1727204453.97698: done checking to see if all hosts have failed 34052 1727204453.97699: getting the remaining hosts for this loop 34052 1727204453.97701: done getting the remaining hosts for this loop 34052 1727204453.97706: getting the next task for host managed-node1 34052 1727204453.97713: done getting next task for host managed-node1 34052 1727204453.97716: ^ task is: TASK: Delete veth interface {{ interface }} 34052 1727204453.97720: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204453.97726: getting variables 34052 1727204453.97728: in VariableManager get_vars() 34052 1727204453.97778: Calling all_inventory to load vars for managed-node1 34052 1727204453.97781: Calling groups_inventory to load vars for managed-node1 34052 1727204453.97783: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204453.97799: Calling all_plugins_play to load vars for managed-node1 34052 1727204453.97802: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204453.97806: Calling groups_plugins_play to load vars for managed-node1 34052 1727204453.99789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204454.02183: done with get_vars() 34052 1727204454.02226: done getting variables 34052 1727204454.02297: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204454.02427: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.066) 0:00:40.341 ***** 34052 1727204454.02470: entering _queue_task() for managed-node1/command 34052 1727204454.03087: worker is 1 (out of 1 available) 34052 1727204454.03100: exiting _queue_task() for managed-node1/command 34052 1727204454.03111: done queuing things up, now waiting for results queue to drain 34052 1727204454.03113: waiting for pending results... 
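The 'Set up veth as managed by NetworkManager' task was skipped before its command was templated, so only its conditional (type == 'veth' and state == 'present') is visible in the log. A hedged sketch of its likely shape; the nmcli command below is an assumption, not something this log shows:

    - name: Set up veth as managed by NetworkManager
      command: nmcli d set {{ interface }} managed true   # assumed command; never rendered for the skipped task
      when: type == 'veth' and state == 'present'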
34052 1727204454.03338: running TaskExecutor() for managed-node1/TASK: Delete veth interface veth0 34052 1727204454.03377: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005d2 34052 1727204454.03384: variable 'ansible_search_path' from source: unknown 34052 1727204454.03388: variable 'ansible_search_path' from source: unknown 34052 1727204454.03537: calling self._execute() 34052 1727204454.03549: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.03561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.03574: variable 'omit' from source: magic vars 34052 1727204454.03986: variable 'ansible_distribution_major_version' from source: facts 34052 1727204454.04003: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204454.04243: variable 'type' from source: play vars 34052 1727204454.04247: variable 'state' from source: include params 34052 1727204454.04253: variable 'interface' from source: play vars 34052 1727204454.04256: variable 'current_interfaces' from source: set_fact 34052 1727204454.04271: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 34052 1727204454.04278: variable 'omit' from source: magic vars 34052 1727204454.04332: variable 'omit' from source: magic vars 34052 1727204454.04440: variable 'interface' from source: play vars 34052 1727204454.04520: variable 'omit' from source: magic vars 34052 1727204454.04524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204454.04550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204454.04575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204454.04598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204454.04610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204454.04648: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204454.04652: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.04656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.04777: Set connection var ansible_connection to ssh 34052 1727204454.04786: Set connection var ansible_timeout to 10 34052 1727204454.04793: Set connection var ansible_pipelining to False 34052 1727204454.04796: Set connection var ansible_shell_type to sh 34052 1727204454.04838: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204454.04845: Set connection var ansible_shell_executable to /bin/sh 34052 1727204454.04848: variable 'ansible_shell_executable' from source: unknown 34052 1727204454.04851: variable 'ansible_connection' from source: unknown 34052 1727204454.04853: variable 'ansible_module_compression' from source: unknown 34052 1727204454.04855: variable 'ansible_shell_type' from source: unknown 34052 1727204454.04859: variable 'ansible_shell_executable' from source: unknown 34052 1727204454.04862: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.04907: variable 'ansible_pipelining' from source: unknown 34052 1727204454.04910: variable 'ansible_timeout' from source: unknown 34052 1727204454.04912: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.05041: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204454.05055: variable 'omit' from source: magic vars 34052 1727204454.05174: starting attempt loop 34052 1727204454.05178: running the handler 34052 1727204454.05184: _low_level_execute_command(): starting 34052 1727204454.05187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204454.06064: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204454.06073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.06076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.06078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204454.06081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204454.06083: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204454.06085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.06087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204454.06091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.06094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.06178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.07953: stdout chunk (state=3): >>>/root <<< 34052 1727204454.08087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.08189: stderr chunk (state=3): >>><<< 34052 1727204454.08193: stdout chunk (state=3): >>><<< 34052 1727204454.08215: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204454.08272: _low_level_execute_command(): starting 34052 1727204454.08279: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236 `" && echo ansible-tmp-1727204454.0822172-36564-192804846938236="` echo /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236 `" ) && sleep 0' 34052 1727204454.08953: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204454.08972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.08987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.09005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204454.09046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.09082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.09153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.09180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.09214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.09280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.11383: stdout chunk (state=3): >>>ansible-tmp-1727204454.0822172-36564-192804846938236=/root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236 <<< 34052 1727204454.11693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.11774: stderr chunk (state=3): >>><<< 34052 1727204454.11780: stdout chunk (state=3): >>><<< 34052 1727204454.11784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204454.0822172-36564-192804846938236=/root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204454.11807: variable 'ansible_module_compression' from source: unknown 34052 1727204454.11945: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204454.12211: variable 'ansible_facts' from source: unknown 34052 1727204454.12256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/AnsiballZ_command.py 34052 1727204454.12572: Sending initial data 34052 1727204454.12581: Sent initial data (156 bytes) 34052 1727204454.13755: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.13867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.14021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.14052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.14210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.14260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.16050: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" 
revision 1 <<< 34052 1727204454.16175: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204454.16227: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp4stfgn_1 /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/AnsiballZ_command.py <<< 34052 1727204454.16230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/AnsiballZ_command.py" <<< 34052 1727204454.16281: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp4stfgn_1" to remote "/root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/AnsiballZ_command.py" <<< 34052 1727204454.17113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.17275: stderr chunk (state=3): >>><<< 34052 1727204454.17279: stdout chunk (state=3): >>><<< 34052 1727204454.17282: done transferring module to remote 34052 1727204454.17284: _low_level_execute_command(): starting 34052 1727204454.17287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/ /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/AnsiballZ_command.py && sleep 0' 34052 1727204454.18046: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.18115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.18143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.18193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.18258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.20320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.20324: stdout chunk (state=3): >>><<< 34052 1727204454.20326: stderr chunk (state=3): >>><<< 34052 1727204454.20344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204454.20351: _low_level_execute_command(): starting 34052 1727204454.20359: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/AnsiballZ_command.py && sleep 0' 34052 1727204454.21149: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.21227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.21273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.21318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.21395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.39891: stdout chunk (state=3): >>> <<< 34052 1727204454.39959: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 15:00:54.385641", "end": "2024-09-24 15:00:54.396797", "delta": "0:00:00.011156", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204454.42949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204454.42954: stdout chunk (state=3): >>><<< 34052 1727204454.43172: stderr chunk (state=3): >>><<< 34052 1727204454.43177: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 15:00:54.385641", "end": "2024-09-24 15:00:54.396797", "delta": "0:00:00.011156", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
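The module invocation above records both the rendered command and the conditional that let this task run. A minimal sketch of the task that would produce this invocation, assuming the interface name is templated the same way as in the task title:

    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }} type veth
      when: type == 'veth' and state == 'absent' and interface in current_interfaces

The logged module_args also show the command module defaults in effect for this call: _uses_shell is false (the split argv is executed directly, not through a shell), expand_argument_vars is true, and no chdir/creates/removes guards are set.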
34052 1727204454.43181: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204454.43184: _low_level_execute_command(): starting 34052 1727204454.43186: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204454.0822172-36564-192804846938236/ > /dev/null 2>&1 && sleep 0' 34052 1727204454.43742: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204454.43751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.43764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.43784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204454.43796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 <<< 34052 1727204454.43803: stderr chunk (state=3): >>>debug2: match not found <<< 34052 1727204454.43813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.43839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204454.43849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address <<< 34052 1727204454.43887: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.43952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.43985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.43991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.44079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.46177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.46181: stderr chunk (state=3): >>><<< 34052 1727204454.46184: stdout chunk (state=3): >>><<< 34052 1727204454.46186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204454.46189: handler run complete 34052 1727204454.46209: Evaluated conditional (False): False 34052 1727204454.46229: attempt loop complete, returning result 34052 1727204454.46232: _execute() done 34052 1727204454.46238: dumping result to json 34052 1727204454.46243: done dumping result, returning 34052 1727204454.46471: done running TaskExecutor() for managed-node1/TASK: Delete veth interface veth0 [127b8e07-fff9-66a4-e2a3-0000000005d2] 34052 1727204454.46475: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d2 34052 1727204454.46549: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d2 34052 1727204454.46552: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.011156", "end": "2024-09-24 15:00:54.396797", "rc": 0, "start": "2024-09-24 15:00:54.385641" } 34052 1727204454.46645: no more pending results, returning what we have 34052 1727204454.46834: results queue empty 34052 1727204454.46836: checking for any_errors_fatal 34052 1727204454.46842: done checking for any_errors_fatal 34052 1727204454.46843: checking for max_fail_percentage 34052 1727204454.46845: done checking for max_fail_percentage 34052 1727204454.46846: checking to see if all hosts have failed and the running result is not ok 34052 1727204454.46847: done checking to see if all hosts have failed 34052 1727204454.46848: getting the remaining hosts for this loop 34052 1727204454.46849: done getting the remaining hosts for this loop 34052 1727204454.46854: getting the next task for host managed-node1 34052 1727204454.46860: done getting next task for host managed-node1 34052 1727204454.46864: ^ task is: TASK: Create dummy interface {{ interface }} 34052 1727204454.46869: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204454.46873: getting variables 34052 1727204454.46875: in VariableManager get_vars() 34052 1727204454.46916: Calling all_inventory to load vars for managed-node1 34052 1727204454.46919: Calling groups_inventory to load vars for managed-node1 34052 1727204454.46922: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204454.46934: Calling all_plugins_play to load vars for managed-node1 34052 1727204454.46942: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204454.46946: Calling groups_plugins_play to load vars for managed-node1 34052 1727204454.48687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204454.50931: done with get_vars() 34052 1727204454.50980: done getting variables 34052 1727204454.51055: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204454.51190: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.487) 0:00:40.829 ***** 34052 1727204454.51228: entering _queue_task() for managed-node1/command 34052 1727204454.51882: worker is 1 (out of 1 available) 34052 1727204454.51895: exiting _queue_task() for managed-node1/command 34052 1727204454.51908: done queuing things up, now waiting for results queue to drain 34052 1727204454.51910: waiting for pending results... 
34052 1727204454.52034: running TaskExecutor() for managed-node1/TASK: Create dummy interface veth0 34052 1727204454.52158: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005d3 34052 1727204454.52178: variable 'ansible_search_path' from source: unknown 34052 1727204454.52182: variable 'ansible_search_path' from source: unknown 34052 1727204454.52228: calling self._execute() 34052 1727204454.52346: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.52358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.52374: variable 'omit' from source: magic vars 34052 1727204454.52835: variable 'ansible_distribution_major_version' from source: facts 34052 1727204454.52843: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204454.53096: variable 'type' from source: play vars 34052 1727204454.53100: variable 'state' from source: include params 34052 1727204454.53104: variable 'interface' from source: play vars 34052 1727204454.53109: variable 'current_interfaces' from source: set_fact 34052 1727204454.53125: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 34052 1727204454.53131: when evaluation is False, skipping this task 34052 1727204454.53135: _execute() done 34052 1727204454.53138: dumping result to json 34052 1727204454.53173: done dumping result, returning 34052 1727204454.53177: done running TaskExecutor() for managed-node1/TASK: Create dummy interface veth0 [127b8e07-fff9-66a4-e2a3-0000000005d3] 34052 1727204454.53179: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d3 34052 1727204454.53440: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d3 34052 1727204454.53443: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204454.53509: no more pending results, returning what we have 34052 1727204454.53512: results queue empty 34052 1727204454.53513: checking for any_errors_fatal 34052 1727204454.53521: done checking for any_errors_fatal 34052 1727204454.53522: checking for max_fail_percentage 34052 1727204454.53523: done checking for max_fail_percentage 34052 1727204454.53524: checking to see if all hosts have failed and the running result is not ok 34052 1727204454.53525: done checking to see if all hosts have failed 34052 1727204454.53526: getting the remaining hosts for this loop 34052 1727204454.53527: done getting the remaining hosts for this loop 34052 1727204454.53531: getting the next task for host managed-node1 34052 1727204454.53537: done getting next task for host managed-node1 34052 1727204454.53539: ^ task is: TASK: Delete dummy interface {{ interface }} 34052 1727204454.53543: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204454.53636: getting variables 34052 1727204454.53638: in VariableManager get_vars() 34052 1727204454.53686: Calling all_inventory to load vars for managed-node1 34052 1727204454.53689: Calling groups_inventory to load vars for managed-node1 34052 1727204454.53692: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204454.53703: Calling all_plugins_play to load vars for managed-node1 34052 1727204454.53706: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204454.53709: Calling groups_plugins_play to load vars for managed-node1 34052 1727204454.55711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204454.57761: done with get_vars() 34052 1727204454.57794: done getting variables 34052 1727204454.57848: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204454.57939: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.067) 0:00:40.896 ***** 34052 1727204454.57968: entering _queue_task() for managed-node1/command 34052 1727204454.58263: worker is 1 (out of 1 available) 34052 1727204454.58279: exiting _queue_task() for managed-node1/command 34052 1727204454.58294: done queuing things up, now waiting for results queue to drain 34052 1727204454.58296: waiting for pending results... 
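Both dummy-interface tasks in this block are skipped because type is 'veth' for this run. A hedged sketch of the create/delete pair, mirroring the veth tasks above; the ip commands are assumptions, since skipped tasks never render their commands in the log:

    - name: Create dummy interface {{ interface }}
      command: ip link add {{ interface }} type dummy     # assumed command
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces

    - name: Delete dummy interface {{ interface }}
      command: ip link del {{ interface }} type dummy     # assumed command
      when: type == 'dummy' and state == 'absent' and interface in current_interfaces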
34052 1727204454.58505: running TaskExecutor() for managed-node1/TASK: Delete dummy interface veth0 34052 1727204454.58591: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005d4 34052 1727204454.58604: variable 'ansible_search_path' from source: unknown 34052 1727204454.58607: variable 'ansible_search_path' from source: unknown 34052 1727204454.58648: calling self._execute() 34052 1727204454.58749: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.58753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.58764: variable 'omit' from source: magic vars 34052 1727204454.59077: variable 'ansible_distribution_major_version' from source: facts 34052 1727204454.59087: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204454.59250: variable 'type' from source: play vars 34052 1727204454.59254: variable 'state' from source: include params 34052 1727204454.59257: variable 'interface' from source: play vars 34052 1727204454.59260: variable 'current_interfaces' from source: set_fact 34052 1727204454.59270: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 34052 1727204454.59272: when evaluation is False, skipping this task 34052 1727204454.59275: _execute() done 34052 1727204454.59278: dumping result to json 34052 1727204454.59288: done dumping result, returning 34052 1727204454.59316: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface veth0 [127b8e07-fff9-66a4-e2a3-0000000005d4] 34052 1727204454.59320: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d4 34052 1727204454.59425: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d4 34052 1727204454.59429: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204454.59514: no more pending results, returning what we have 34052 1727204454.59519: results queue empty 34052 1727204454.59520: checking for any_errors_fatal 34052 1727204454.59528: done checking for any_errors_fatal 34052 1727204454.59529: checking for max_fail_percentage 34052 1727204454.59531: done checking for max_fail_percentage 34052 1727204454.59532: checking to see if all hosts have failed and the running result is not ok 34052 1727204454.59532: done checking to see if all hosts have failed 34052 1727204454.59533: getting the remaining hosts for this loop 34052 1727204454.59535: done getting the remaining hosts for this loop 34052 1727204454.59539: getting the next task for host managed-node1 34052 1727204454.59546: done getting next task for host managed-node1 34052 1727204454.59550: ^ task is: TASK: Create tap interface {{ interface }} 34052 1727204454.59553: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204454.59557: getting variables 34052 1727204454.59559: in VariableManager get_vars() 34052 1727204454.59635: Calling all_inventory to load vars for managed-node1 34052 1727204454.59638: Calling groups_inventory to load vars for managed-node1 34052 1727204454.59640: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204454.59655: Calling all_plugins_play to load vars for managed-node1 34052 1727204454.59658: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204454.59661: Calling groups_plugins_play to load vars for managed-node1 34052 1727204454.61263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204454.62573: done with get_vars() 34052 1727204454.62596: done getting variables 34052 1727204454.62652: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204454.62746: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.048) 0:00:40.944 ***** 34052 1727204454.62780: entering _queue_task() for managed-node1/command 34052 1727204454.63267: worker is 1 (out of 1 available) 34052 1727204454.63282: exiting _queue_task() for managed-node1/command 34052 1727204454.63297: done queuing things up, now waiting for results queue to drain 34052 1727204454.63299: waiting for pending results... 
34052 1727204454.63786: running TaskExecutor() for managed-node1/TASK: Create tap interface veth0 34052 1727204454.63793: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005d5 34052 1727204454.63801: variable 'ansible_search_path' from source: unknown 34052 1727204454.63804: variable 'ansible_search_path' from source: unknown 34052 1727204454.63862: calling self._execute() 34052 1727204454.64000: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.64006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.64018: variable 'omit' from source: magic vars 34052 1727204454.64524: variable 'ansible_distribution_major_version' from source: facts 34052 1727204454.64773: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204454.64822: variable 'type' from source: play vars 34052 1727204454.64826: variable 'state' from source: include params 34052 1727204454.64834: variable 'interface' from source: play vars 34052 1727204454.64837: variable 'current_interfaces' from source: set_fact 34052 1727204454.64847: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 34052 1727204454.64851: when evaluation is False, skipping this task 34052 1727204454.64853: _execute() done 34052 1727204454.64856: dumping result to json 34052 1727204454.64859: done dumping result, returning 34052 1727204454.64871: done running TaskExecutor() for managed-node1/TASK: Create tap interface veth0 [127b8e07-fff9-66a4-e2a3-0000000005d5] 34052 1727204454.64875: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d5 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204454.65042: no more pending results, returning what we have 34052 1727204454.65047: results queue empty 34052 1727204454.65048: checking for any_errors_fatal 34052 1727204454.65055: done checking for any_errors_fatal 34052 1727204454.65056: checking for max_fail_percentage 34052 1727204454.65058: done checking for max_fail_percentage 34052 1727204454.65060: checking to see if all hosts have failed and the running result is not ok 34052 1727204454.65061: done checking to see if all hosts have failed 34052 1727204454.65062: getting the remaining hosts for this loop 34052 1727204454.65064: done getting the remaining hosts for this loop 34052 1727204454.65072: getting the next task for host managed-node1 34052 1727204454.65082: done getting next task for host managed-node1 34052 1727204454.65085: ^ task is: TASK: Delete tap interface {{ interface }} 34052 1727204454.65091: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204454.65097: getting variables 34052 1727204454.65099: in VariableManager get_vars() 34052 1727204454.65160: Calling all_inventory to load vars for managed-node1 34052 1727204454.65164: Calling groups_inventory to load vars for managed-node1 34052 1727204454.65175: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204454.65185: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d5 34052 1727204454.65204: Calling all_plugins_play to load vars for managed-node1 34052 1727204454.65209: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204454.65214: Calling groups_plugins_play to load vars for managed-node1 34052 1727204454.65813: WORKER PROCESS EXITING 34052 1727204454.67159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204454.69449: done with get_vars() 34052 1727204454.69500: done getting variables 34052 1727204454.69587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34052 1727204454.69724: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.069) 0:00:41.014 ***** 34052 1727204454.69761: entering _queue_task() for managed-node1/command 34052 1727204454.70191: worker is 1 (out of 1 available) 34052 1727204454.70205: exiting _queue_task() for managed-node1/command 34052 1727204454.70219: done queuing things up, now waiting for results queue to drain 34052 1727204454.70220: waiting for pending results... 
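The tap-interface tasks follow the same pattern and are skipped for the same reason, only the type value in the conditional changes. A hedged sketch, with the tuntap commands assumed rather than taken from this log:

    - name: Create tap interface {{ interface }}
      command: ip tuntap add dev {{ interface }} mode tap   # assumed command
      when: type == 'tap' and state == 'present' and interface not in current_interfaces

    - name: Delete tap interface {{ interface }}
      command: ip tuntap del dev {{ interface }} mode tap   # assumed command
      when: type == 'tap' and state == 'absent' and interface in current_interfaces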
34052 1727204454.70600: running TaskExecutor() for managed-node1/TASK: Delete tap interface veth0 34052 1727204454.70700: in run() - task 127b8e07-fff9-66a4-e2a3-0000000005d6 34052 1727204454.70724: variable 'ansible_search_path' from source: unknown 34052 1727204454.70732: variable 'ansible_search_path' from source: unknown 34052 1727204454.70781: calling self._execute() 34052 1727204454.70910: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.71071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.71075: variable 'omit' from source: magic vars 34052 1727204454.71354: variable 'ansible_distribution_major_version' from source: facts 34052 1727204454.71375: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204454.71618: variable 'type' from source: play vars 34052 1727204454.71633: variable 'state' from source: include params 34052 1727204454.71643: variable 'interface' from source: play vars 34052 1727204454.71651: variable 'current_interfaces' from source: set_fact 34052 1727204454.71663: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 34052 1727204454.71673: when evaluation is False, skipping this task 34052 1727204454.71680: _execute() done 34052 1727204454.71687: dumping result to json 34052 1727204454.71695: done dumping result, returning 34052 1727204454.71734: done running TaskExecutor() for managed-node1/TASK: Delete tap interface veth0 [127b8e07-fff9-66a4-e2a3-0000000005d6] 34052 1727204454.71737: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d6 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34052 1727204454.71894: no more pending results, returning what we have 34052 1727204454.71899: results queue empty 34052 1727204454.71900: checking for any_errors_fatal 34052 1727204454.71908: done checking for any_errors_fatal 34052 1727204454.71909: checking for max_fail_percentage 34052 1727204454.71911: done checking for max_fail_percentage 34052 1727204454.71912: checking to see if all hosts have failed and the running result is not ok 34052 1727204454.71913: done checking to see if all hosts have failed 34052 1727204454.71914: getting the remaining hosts for this loop 34052 1727204454.71916: done getting the remaining hosts for this loop 34052 1727204454.71921: getting the next task for host managed-node1 34052 1727204454.71930: done getting next task for host managed-node1 34052 1727204454.71934: ^ task is: TASK: Clean up namespace 34052 1727204454.71938: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204454.71943: getting variables 34052 1727204454.71945: in VariableManager get_vars() 34052 1727204454.71999: Calling all_inventory to load vars for managed-node1 34052 1727204454.72003: Calling groups_inventory to load vars for managed-node1 34052 1727204454.72005: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204454.72023: Calling all_plugins_play to load vars for managed-node1 34052 1727204454.72026: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204454.72030: Calling groups_plugins_play to load vars for managed-node1 34052 1727204454.72686: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000005d6 34052 1727204454.72691: WORKER PROCESS EXITING 34052 1727204454.79877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204454.82049: done with get_vars() 34052 1727204454.82098: done getting variables 34052 1727204454.82151: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.124) 0:00:41.138 ***** 34052 1727204454.82180: entering _queue_task() for managed-node1/command 34052 1727204454.82801: worker is 1 (out of 1 available) 34052 1727204454.82814: exiting _queue_task() for managed-node1/command 34052 1727204454.82826: done queuing things up, now waiting for results queue to drain 34052 1727204454.82827: waiting for pending results... 
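The trace below follows the "Clean up namespace" task from tests_ipv6.yml:108 through a full remote execution: connection setup, staging of AnsiballZ_command.py in a temporary directory, running "ip netns delete ns1", and removal of that directory. Judging from the module arguments and the result further down (the module itself reports changed: true, the handler then evaluates a conditional (False), and the final task result prints changed: false), the task most likely resembles this minimal sketch, offered as a reconstruction rather than the actual playbook source:

    - name: Clean up namespace
      command: ip netns delete ns1
      changed_when: false   # inferred from the "Evaluated conditional (False): False" entry and the final changed: false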
34052 1727204454.82929: running TaskExecutor() for managed-node1/TASK: Clean up namespace 34052 1727204454.83061: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000b4 34052 1727204454.83087: variable 'ansible_search_path' from source: unknown 34052 1727204454.83135: calling self._execute() 34052 1727204454.83263: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.83284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.83303: variable 'omit' from source: magic vars 34052 1727204454.83732: variable 'ansible_distribution_major_version' from source: facts 34052 1727204454.83752: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204454.83764: variable 'omit' from source: magic vars 34052 1727204454.83791: variable 'omit' from source: magic vars 34052 1727204454.83844: variable 'omit' from source: magic vars 34052 1727204454.83895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204454.84036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204454.84039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204454.84043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204454.84046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204454.84050: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34052 1727204454.84059: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.84070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.84186: Set connection var ansible_connection to ssh 34052 1727204454.84201: Set connection var ansible_timeout to 10 34052 1727204454.84213: Set connection var ansible_pipelining to False 34052 1727204454.84221: Set connection var ansible_shell_type to sh 34052 1727204454.84235: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204454.84252: Set connection var ansible_shell_executable to /bin/sh 34052 1727204454.84287: variable 'ansible_shell_executable' from source: unknown 34052 1727204454.84296: variable 'ansible_connection' from source: unknown 34052 1727204454.84303: variable 'ansible_module_compression' from source: unknown 34052 1727204454.84310: variable 'ansible_shell_type' from source: unknown 34052 1727204454.84316: variable 'ansible_shell_executable' from source: unknown 34052 1727204454.84322: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204454.84329: variable 'ansible_pipelining' from source: unknown 34052 1727204454.84335: variable 'ansible_timeout' from source: unknown 34052 1727204454.84342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204454.84583: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204454.84587: variable 'omit' from source: magic vars 34052 1727204454.84589: starting attempt loop 34052 1727204454.84592: running the handler 
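The connection variables set just above (ansible_connection=ssh, ansible_timeout=10, ansible_pipelining=False, ansible_shell_type=sh, ansible_shell_executable=/bin/sh) largely come from defaults in this run, as their sources are logged as "unknown". If one wanted to pin the same behaviour explicitly, the equivalent inventory or group_vars entries would look like this hedged sketch:

    # Illustrative group_vars entries; in this run the same values are supplied by defaults
    ansible_connection: ssh
    ansible_timeout: 10
    ansible_pipelining: false
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh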
34052 1727204454.84594: _low_level_execute_command(): starting 34052 1727204454.84597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204454.85356: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34052 1727204454.85381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.85465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.85512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.85532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.85569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.85655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.87419: stdout chunk (state=3): >>>/root <<< 34052 1727204454.87538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.87592: stderr chunk (state=3): >>><<< 34052 1727204454.87595: stdout chunk (state=3): >>><<< 34052 1727204454.87618: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204454.87637: _low_level_execute_command(): starting 34052 1727204454.87645: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873 `" && echo ansible-tmp-1727204454.8761954-36590-270030194506873="` echo 
/root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873 `" ) && sleep 0' 34052 1727204454.88124: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.88130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.88134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration <<< 34052 1727204454.88145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204454.88147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.88200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.88203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.88205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.88251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.90309: stdout chunk (state=3): >>>ansible-tmp-1727204454.8761954-36590-270030194506873=/root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873 <<< 34052 1727204454.90529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.90535: stdout chunk (state=3): >>><<< 34052 1727204454.90538: stderr chunk (state=3): >>><<< 34052 1727204454.90628: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204454.8761954-36590-270030194506873=/root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204454.90633: variable 'ansible_module_compression' from source: unknown 
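The mkdir command above creates the time-stamped per-task staging directory under the remote user's ~/.ansible/tmp; the AnsiballZ_command.py payload is uploaded into it next and the directory is removed again once the module has run. Where this directory lives is governed by the shell plugin's remote_tmp setting; as a purely hypothetical override (not used in this run, where the default ~/.ansible/tmp applies), a host could relocate it via:

    # Hypothetical host_vars entry; this run uses the default ~/.ansible/tmp
    ansible_remote_tmp: /var/tmp/ansible-staging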
34052 1727204454.90700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204454.90737: variable 'ansible_facts' from source: unknown 34052 1727204454.90815: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/AnsiballZ_command.py 34052 1727204454.90939: Sending initial data 34052 1727204454.90959: Sent initial data (156 bytes) 34052 1727204454.91449: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.91453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.91455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34052 1727204454.91458: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.91460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.91515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.91519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.91577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.93294: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204454.93340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34052 1727204454.93387: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpi6hotikb /root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/AnsiballZ_command.py <<< 34052 1727204454.93391: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/AnsiballZ_command.py" <<< 34052 1727204454.93446: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 34052 1727204454.93464: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmpi6hotikb" to remote "/root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/AnsiballZ_command.py" <<< 34052 1727204454.94090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.94163: stderr chunk (state=3): >>><<< 34052 1727204454.94169: stdout chunk (state=3): >>><<< 34052 1727204454.94190: done transferring module to remote 34052 1727204454.94202: _low_level_execute_command(): starting 34052 1727204454.94208: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/ /root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/AnsiballZ_command.py && sleep 0' 34052 1727204454.94711: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.94715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.94718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204454.94720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.94784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.94788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.94793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.94847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204454.96779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204454.96839: stderr chunk (state=3): >>><<< 34052 1727204454.96842: stdout chunk (state=3): >>><<< 34052 1727204454.96857: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204454.96860: _low_level_execute_command(): starting 34052 1727204454.96867: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/AnsiballZ_command.py && sleep 0' 34052 1727204454.97349: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.97353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.97383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204454.97387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204454.97437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204454.97442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204454.97452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204454.97526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.15243: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-24 15:00:55.145791", "end": "2024-09-24 15:00:55.151484", "delta": "0:00:00.005693", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204455.16997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
<<< 34052 1727204455.17002: stderr chunk (state=3): >>><<< 34052 1727204455.17005: stdout chunk (state=3): >>><<< 34052 1727204455.17177: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-24 15:00:55.145791", "end": "2024-09-24 15:00:55.151484", "delta": "0:00:00.005693", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
34052 1727204455.17182: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204455.17184: _low_level_execute_command(): starting 34052 1727204455.17187: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204454.8761954-36590-270030194506873/ > /dev/null 2>&1 && sleep 0' 34052 1727204455.17878: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.17889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204455.17899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204455.17920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204455.18033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.20015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204455.20072: stderr chunk (state=3): >>><<< 34052 1727204455.20078: stdout chunk (state=3): >>><<< 34052 1727204455.20096: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204455.20102: handler run complete 34052 1727204455.20122: Evaluated conditional (False): False 34052 1727204455.20133: attempt loop complete, returning result 34052 1727204455.20136: _execute() done 34052 1727204455.20138: dumping result to json 34052 1727204455.20144: done dumping result, returning 34052 1727204455.20152: done running TaskExecutor() for managed-node1/TASK: Clean up namespace [127b8e07-fff9-66a4-e2a3-0000000000b4] 34052 1727204455.20156: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b4 34052 1727204455.20269: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b4 34052 1727204455.20273: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.005693", "end": "2024-09-24 15:00:55.151484", "rc": 0, "start": "2024-09-24 15:00:55.145791" } 34052 1727204455.20364: no more pending results, returning what we have 34052 1727204455.20370: results queue empty 34052 1727204455.20371: checking for any_errors_fatal 34052 1727204455.20376: done checking for any_errors_fatal 34052 1727204455.20376: checking for max_fail_percentage 34052 1727204455.20378: done checking for max_fail_percentage 34052 1727204455.20379: checking to see if all hosts have failed and the running result is not ok 34052 1727204455.20380: done checking to see if all hosts have failed 34052 1727204455.20381: getting the remaining hosts for this loop 34052 1727204455.20384: done getting the remaining hosts for this loop 34052 1727204455.20388: getting the next task for host managed-node1 34052 1727204455.20393: done getting next task for host managed-node1 34052 1727204455.20397: ^ task is: TASK: Verify network state restored to default 34052 1727204455.20399: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204455.20403: getting variables 34052 1727204455.20404: in VariableManager get_vars() 34052 1727204455.20446: Calling all_inventory to load vars for managed-node1 34052 1727204455.20449: Calling groups_inventory to load vars for managed-node1 34052 1727204455.20451: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.20463: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.20473: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.20477: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.21463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.22776: done with get_vars() 34052 1727204455.22797: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.406) 0:00:41.546 ***** 34052 1727204455.22880: entering _queue_task() for managed-node1/include_tasks 34052 1727204455.23176: worker is 1 (out of 1 available) 34052 1727204455.23191: exiting _queue_task() for managed-node1/include_tasks 34052 1727204455.23207: done queuing things up, now waiting for results queue to drain 34052 1727204455.23209: waiting for pending results... 34052 1727204455.23407: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 34052 1727204455.23496: in run() - task 127b8e07-fff9-66a4-e2a3-0000000000b5 34052 1727204455.23510: variable 'ansible_search_path' from source: unknown 34052 1727204455.23548: calling self._execute() 34052 1727204455.23639: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204455.23645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204455.23657: variable 'omit' from source: magic vars 34052 1727204455.23978: variable 'ansible_distribution_major_version' from source: facts 34052 1727204455.23991: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204455.23998: _execute() done 34052 1727204455.24001: dumping result to json 34052 1727204455.24004: done dumping result, returning 34052 1727204455.24011: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [127b8e07-fff9-66a4-e2a3-0000000000b5] 34052 1727204455.24016: sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b5 34052 1727204455.24122: done sending task result for task 127b8e07-fff9-66a4-e2a3-0000000000b5 34052 1727204455.24125: WORKER PROCESS EXITING 34052 1727204455.24154: no more pending results, returning what we have 34052 1727204455.24160: in VariableManager get_vars() 34052 1727204455.24214: Calling all_inventory to load vars for managed-node1 34052 1727204455.24217: Calling groups_inventory to load vars for managed-node1 34052 1727204455.24219: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.24234: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.24237: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.24240: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.25282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.26471: done with get_vars() 34052 1727204455.26494: 
variable 'ansible_search_path' from source: unknown 34052 1727204455.26508: we have included files to process 34052 1727204455.26509: generating all_blocks data 34052 1727204455.26510: done generating all_blocks data 34052 1727204455.26515: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34052 1727204455.26516: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34052 1727204455.26517: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34052 1727204455.26823: done processing included file 34052 1727204455.26825: iterating over new_blocks loaded from include file 34052 1727204455.26826: in VariableManager get_vars() 34052 1727204455.26841: done with get_vars() 34052 1727204455.26842: filtering new block on tags 34052 1727204455.26855: done filtering new block on tags 34052 1727204455.26857: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 34052 1727204455.26862: extending task lists for all hosts with included blocks 34052 1727204455.28651: done extending task lists 34052 1727204455.28653: done processing included files 34052 1727204455.28654: results queue empty 34052 1727204455.28654: checking for any_errors_fatal 34052 1727204455.28659: done checking for any_errors_fatal 34052 1727204455.28659: checking for max_fail_percentage 34052 1727204455.28660: done checking for max_fail_percentage 34052 1727204455.28661: checking to see if all hosts have failed and the running result is not ok 34052 1727204455.28662: done checking to see if all hosts have failed 34052 1727204455.28662: getting the remaining hosts for this loop 34052 1727204455.28663: done getting the remaining hosts for this loop 34052 1727204455.28666: getting the next task for host managed-node1 34052 1727204455.28670: done getting next task for host managed-node1 34052 1727204455.28671: ^ task is: TASK: Check routes and DNS 34052 1727204455.28673: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34052 1727204455.28675: getting variables 34052 1727204455.28676: in VariableManager get_vars() 34052 1727204455.28691: Calling all_inventory to load vars for managed-node1 34052 1727204455.28693: Calling groups_inventory to load vars for managed-node1 34052 1727204455.28694: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.28701: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.28703: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.28705: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.29630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.30813: done with get_vars() 34052 1727204455.30839: done getting variables 34052 1727204455.30886: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.080) 0:00:41.626 ***** 34052 1727204455.30911: entering _queue_task() for managed-node1/shell 34052 1727204455.31213: worker is 1 (out of 1 available) 34052 1727204455.31228: exiting _queue_task() for managed-node1/shell 34052 1727204455.31242: done queuing things up, now waiting for results queue to drain 34052 1727204455.31244: waiting for pending results... 34052 1727204455.31443: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 34052 1727204455.31525: in run() - task 127b8e07-fff9-66a4-e2a3-00000000075e 34052 1727204455.31543: variable 'ansible_search_path' from source: unknown 34052 1727204455.31547: variable 'ansible_search_path' from source: unknown 34052 1727204455.31586: calling self._execute() 34052 1727204455.31671: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204455.31681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204455.31691: variable 'omit' from source: magic vars 34052 1727204455.32013: variable 'ansible_distribution_major_version' from source: facts 34052 1727204455.32025: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204455.32035: variable 'omit' from source: magic vars 34052 1727204455.32067: variable 'omit' from source: magic vars 34052 1727204455.32096: variable 'omit' from source: magic vars 34052 1727204455.32131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34052 1727204455.32169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34052 1727204455.32188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34052 1727204455.32202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204455.32213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34052 1727204455.32242: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
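The "Check routes and DNS" task traced below uses the shell action rather than command. Its full script is visible later in the module arguments of this trace; reconstructed from those arguments (the included check_network_dns.yml itself is not reproduced in this log), the task is essentially:

    - name: Check routes and DNS
      shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi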
34052 1727204455.32247: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204455.32249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204455.32324: Set connection var ansible_connection to ssh 34052 1727204455.32332: Set connection var ansible_timeout to 10 34052 1727204455.32341: Set connection var ansible_pipelining to False 34052 1727204455.32343: Set connection var ansible_shell_type to sh 34052 1727204455.32351: Set connection var ansible_module_compression to ZIP_DEFLATED 34052 1727204455.32360: Set connection var ansible_shell_executable to /bin/sh 34052 1727204455.32383: variable 'ansible_shell_executable' from source: unknown 34052 1727204455.32386: variable 'ansible_connection' from source: unknown 34052 1727204455.32390: variable 'ansible_module_compression' from source: unknown 34052 1727204455.32393: variable 'ansible_shell_type' from source: unknown 34052 1727204455.32396: variable 'ansible_shell_executable' from source: unknown 34052 1727204455.32398: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204455.32400: variable 'ansible_pipelining' from source: unknown 34052 1727204455.32403: variable 'ansible_timeout' from source: unknown 34052 1727204455.32405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204455.32524: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204455.32536: variable 'omit' from source: magic vars 34052 1727204455.32540: starting attempt loop 34052 1727204455.32544: running the handler 34052 1727204455.32556: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34052 1727204455.32578: _low_level_execute_command(): starting 34052 1727204455.32588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34052 1727204455.33155: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204455.33160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34052 1727204455.33164: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.33216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 
1727204455.33224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204455.33229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204455.33285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.35067: stdout chunk (state=3): >>>/root <<< 34052 1727204455.35164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204455.35232: stderr chunk (state=3): >>><<< 34052 1727204455.35236: stdout chunk (state=3): >>><<< 34052 1727204455.35256: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204455.35270: _low_level_execute_command(): starting 34052 1727204455.35277: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383 `" && echo ansible-tmp-1727204455.3525717-36610-104791775564383="` echo /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383 `" ) && sleep 0' 34052 1727204455.35790: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204455.35793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found <<< 34052 1727204455.35804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34052 1727204455.35808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204455.35810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.35863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/45f5f78759' <<< 34052 1727204455.35871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204455.35873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204455.35919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.37956: stdout chunk (state=3): >>>ansible-tmp-1727204455.3525717-36610-104791775564383=/root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383 <<< 34052 1727204455.38068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204455.38135: stderr chunk (state=3): >>><<< 34052 1727204455.38138: stdout chunk (state=3): >>><<< 34052 1727204455.38155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204455.3525717-36610-104791775564383=/root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204455.38190: variable 'ansible_module_compression' from source: unknown 34052 1727204455.38240: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34052wq_nnsml/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34052 1727204455.38273: variable 'ansible_facts' from source: unknown 34052 1727204455.38336: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/AnsiballZ_command.py 34052 1727204455.38450: Sending initial data 34052 1727204455.38454: Sent initial data (156 bytes) 34052 1727204455.38969: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204455.38973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.38976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204455.38978: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.39036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204455.39039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204455.39044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204455.39094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.40769: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34052 1727204455.40813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34052 1727204455.40866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp7q_v7u1t /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/AnsiballZ_command.py <<< 34052 1727204455.40871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/AnsiballZ_command.py" <<< 34052 1727204455.40914: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34052wq_nnsml/tmp7q_v7u1t" to remote "/root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/AnsiballZ_command.py" <<< 34052 1727204455.40919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/AnsiballZ_command.py" <<< 34052 1727204455.41502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204455.41585: stderr chunk (state=3): >>><<< 34052 1727204455.41589: stdout chunk (state=3): >>><<< 34052 1727204455.41612: done transferring module to remote 34052 1727204455.41622: _low_level_execute_command(): starting 34052 1727204455.41631: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/ /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/AnsiballZ_command.py && sleep 0' 34052 1727204455.42120: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204455.42124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.42130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204455.42137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found <<< 34052 1727204455.42139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.42190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204455.42197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204455.42199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204455.42247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.44200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204455.44263: stderr chunk (state=3): >>><<< 34052 1727204455.44268: stdout chunk (state=3): >>><<< 34052 1727204455.44284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204455.44287: _low_level_execute_command(): starting 34052 1727204455.44293: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/AnsiballZ_command.py && sleep 0' 34052 1727204455.44772: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204455.44802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found <<< 34052 1727204455.44806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.44809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34052 1727204455.44812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.44872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204455.44875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204455.44877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204455.44944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.63082: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:d0:df:0f:c9:4d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.8.176/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2872sec preferred_lft 2872sec\n inet6 fe80::10d0:dfff:fe0f:c94d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.8.176 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.8.176 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:00:55.620017", "end": "2024-09-24 15:00:55.629546", "delta": "0:00:00.009529", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34052 1727204455.64813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. <<< 34052 1727204455.64879: stderr chunk (state=3): >>><<< 34052 1727204455.64883: stdout chunk (state=3): >>><<< 34052 1727204455.64902: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:d0:df:0f:c9:4d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.8.176/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2872sec preferred_lft 2872sec\n inet6 fe80::10d0:dfff:fe0f:c94d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.8.176 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.8.176 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:00:55.620017", "end": "2024-09-24 15:00:55.629546", "delta": "0:00:00.009529", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.8.176 closed. 
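
The exchange above is Ansible's normal module round trip over the multiplexed SSH connection: the AnsiballZ_command.py payload is uploaded with sftp, made executable with chmod, executed with /usr/bin/python3.12, and the JSON result is read back from stdout before the temporary directory is removed. The diagnostic script that produced this result is embedded verbatim in the module's _raw_params; a minimal sketch of the corresponding task follows. Only the script body and the task name "Check routes and DNS" come from the log itself; the ansible.builtin.shell spelling is an assumption about how the test expresses it.

# Sketch: the shell task reconstructed from the module invocation logged
# above. The script body is copied from _raw_params; the module spelling
# (ansible.builtin.shell) is assumed, not taken from the test source.
- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
    else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
    fi
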
34052 1727204455.64951: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34052 1727204455.64957: _low_level_execute_command(): starting 34052 1727204455.64963: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204455.3525717-36610-104791775564383/ > /dev/null 2>&1 && sleep 0' 34052 1727204455.65440: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204455.65444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.65471: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34052 1727204455.65474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34052 1727204455.65529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' <<< 34052 1727204455.65533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34052 1727204455.65543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34052 1727204455.65609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34052 1727204455.67588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34052 1727204455.67648: stderr chunk (state=3): >>><<< 34052 1727204455.67652: stdout chunk (state=3): >>><<< 34052 1727204455.67668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.8.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.8.176 originally 10.31.8.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/45f5f78759' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34052 1727204455.67677: handler run complete 34052 1727204455.67696: Evaluated conditional (False): False 34052 1727204455.67705: attempt loop complete, returning result 34052 1727204455.67708: _execute() done 34052 1727204455.67710: dumping result to json 34052 1727204455.67717: done dumping result, returning 34052 1727204455.67724: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [127b8e07-fff9-66a4-e2a3-00000000075e] 34052 1727204455.67731: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000075e 34052 1727204455.67852: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000075e 34052 1727204455.67855: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009529", "end": "2024-09-24 15:00:55.629546", "rc": 0, "start": "2024-09-24 15:00:55.620017" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:d0:df:0f:c9:4d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.8.176/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2872sec preferred_lft 2872sec inet6 fe80::10d0:dfff:fe0f:c94d/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.8.176 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.8.176 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 34052 1727204455.67935: no more pending results, returning what we have 34052 1727204455.67939: results queue empty 34052 1727204455.67940: checking for any_errors_fatal 34052 1727204455.67942: done checking for any_errors_fatal 34052 1727204455.67943: checking for max_fail_percentage 34052 1727204455.67944: done checking for max_fail_percentage 34052 1727204455.67946: checking to see if all hosts have failed and the running result is not ok 34052 1727204455.67947: done checking to see if all hosts have failed 34052 1727204455.67948: getting the remaining hosts for this loop 34052 1727204455.67949: done getting the remaining hosts for this loop 34052 1727204455.67953: getting the next task for host managed-node1 34052 1727204455.67959: done getting next task for host managed-node1 34052 1727204455.67961: ^ task is: TASK: Verify DNS and network connectivity 34052 1727204455.67971: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34052 1727204455.67975: getting variables 34052 1727204455.67976: in VariableManager get_vars() 34052 1727204455.68018: Calling all_inventory to load vars for managed-node1 34052 1727204455.68021: Calling groups_inventory to load vars for managed-node1 34052 1727204455.68023: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.68035: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.68038: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.68041: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.69175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.70364: done with get_vars() 34052 1727204455.70394: done getting variables 34052 1727204455.70445: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.395) 0:00:42.021 ***** 34052 1727204455.70474: entering _queue_task() for managed-node1/shell 34052 1727204455.70769: worker is 1 (out of 1 available) 34052 1727204455.70786: exiting _queue_task() for managed-node1/shell 34052 1727204455.70801: done queuing things up, now waiting for results queue to drain 34052 1727204455.70803: waiting for pending results... 
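
The task queued here, "Verify DNS and network connectivity" from check_network_dns.yml:24, is conditional: as the entries that follow show, Ansible evaluates ansible_distribution_major_version != '6' (True) and ansible_facts["distribution"] == "CentOS" (False) and skips the task because the distribution check fails. A minimal sketch of a task guarded this way is shown below; the when: expressions are the ones evaluated in the log, while the body is purely hypothetical, since the task's real source is not part of this trace.

# Sketch: the when: guards match the conditionals evaluated below; the body
# (a simple name-resolution check) is hypothetical and only illustrative.
- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    set -euo pipefail
    getent hosts mirrors.fedoraproject.org
  when:
    - ansible_distribution_major_version != '6'
    - ansible_facts["distribution"] == "CentOS"
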
34052 1727204455.71008: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 34052 1727204455.71089: in run() - task 127b8e07-fff9-66a4-e2a3-00000000075f 34052 1727204455.71103: variable 'ansible_search_path' from source: unknown 34052 1727204455.71107: variable 'ansible_search_path' from source: unknown 34052 1727204455.71142: calling self._execute() 34052 1727204455.71237: variable 'ansible_host' from source: host vars for 'managed-node1' 34052 1727204455.71243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34052 1727204455.71252: variable 'omit' from source: magic vars 34052 1727204455.71570: variable 'ansible_distribution_major_version' from source: facts 34052 1727204455.71584: Evaluated conditional (ansible_distribution_major_version != '6'): True 34052 1727204455.71703: variable 'ansible_facts' from source: unknown 34052 1727204455.72287: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 34052 1727204455.72291: when evaluation is False, skipping this task 34052 1727204455.72294: _execute() done 34052 1727204455.72297: dumping result to json 34052 1727204455.72299: done dumping result, returning 34052 1727204455.72305: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [127b8e07-fff9-66a4-e2a3-00000000075f] 34052 1727204455.72310: sending task result for task 127b8e07-fff9-66a4-e2a3-00000000075f 34052 1727204455.72414: done sending task result for task 127b8e07-fff9-66a4-e2a3-00000000075f 34052 1727204455.72417: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 34052 1727204455.72472: no more pending results, returning what we have 34052 1727204455.72476: results queue empty 34052 1727204455.72477: checking for any_errors_fatal 34052 1727204455.72491: done checking for any_errors_fatal 34052 1727204455.72492: checking for max_fail_percentage 34052 1727204455.72494: done checking for max_fail_percentage 34052 1727204455.72495: checking to see if all hosts have failed and the running result is not ok 34052 1727204455.72496: done checking to see if all hosts have failed 34052 1727204455.72496: getting the remaining hosts for this loop 34052 1727204455.72498: done getting the remaining hosts for this loop 34052 1727204455.72503: getting the next task for host managed-node1 34052 1727204455.72511: done getting next task for host managed-node1 34052 1727204455.72514: ^ task is: TASK: meta (flush_handlers) 34052 1727204455.72516: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204455.72522: getting variables 34052 1727204455.72523: in VariableManager get_vars() 34052 1727204455.72581: Calling all_inventory to load vars for managed-node1 34052 1727204455.72584: Calling groups_inventory to load vars for managed-node1 34052 1727204455.72586: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.72599: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.72602: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.72605: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.73633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.74835: done with get_vars() 34052 1727204455.74863: done getting variables 34052 1727204455.74923: in VariableManager get_vars() 34052 1727204455.74937: Calling all_inventory to load vars for managed-node1 34052 1727204455.74939: Calling groups_inventory to load vars for managed-node1 34052 1727204455.74940: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.74945: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.74946: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.74948: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.75901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.77086: done with get_vars() 34052 1727204455.77121: done queuing things up, now waiting for results queue to drain 34052 1727204455.77123: results queue empty 34052 1727204455.77123: checking for any_errors_fatal 34052 1727204455.77126: done checking for any_errors_fatal 34052 1727204455.77127: checking for max_fail_percentage 34052 1727204455.77128: done checking for max_fail_percentage 34052 1727204455.77128: checking to see if all hosts have failed and the running result is not ok 34052 1727204455.77129: done checking to see if all hosts have failed 34052 1727204455.77129: getting the remaining hosts for this loop 34052 1727204455.77130: done getting the remaining hosts for this loop 34052 1727204455.77132: getting the next task for host managed-node1 34052 1727204455.77135: done getting next task for host managed-node1 34052 1727204455.77137: ^ task is: TASK: meta (flush_handlers) 34052 1727204455.77138: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204455.77140: getting variables 34052 1727204455.77141: in VariableManager get_vars() 34052 1727204455.77151: Calling all_inventory to load vars for managed-node1 34052 1727204455.77153: Calling groups_inventory to load vars for managed-node1 34052 1727204455.77154: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.77160: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.77161: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.77163: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.78074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.79252: done with get_vars() 34052 1727204455.79282: done getting variables 34052 1727204455.79324: in VariableManager get_vars() 34052 1727204455.79338: Calling all_inventory to load vars for managed-node1 34052 1727204455.79340: Calling groups_inventory to load vars for managed-node1 34052 1727204455.79342: Calling all_plugins_inventory to load vars for managed-node1 34052 1727204455.79346: Calling all_plugins_play to load vars for managed-node1 34052 1727204455.79348: Calling groups_plugins_inventory to load vars for managed-node1 34052 1727204455.79351: Calling groups_plugins_play to load vars for managed-node1 34052 1727204455.80224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34052 1727204455.81431: done with get_vars() 34052 1727204455.81463: done queuing things up, now waiting for results queue to drain 34052 1727204455.81464: results queue empty 34052 1727204455.81467: checking for any_errors_fatal 34052 1727204455.81468: done checking for any_errors_fatal 34052 1727204455.81468: checking for max_fail_percentage 34052 1727204455.81469: done checking for max_fail_percentage 34052 1727204455.81470: checking to see if all hosts have failed and the running result is not ok 34052 1727204455.81471: done checking to see if all hosts have failed 34052 1727204455.81471: getting the remaining hosts for this loop 34052 1727204455.81472: done getting the remaining hosts for this loop 34052 1727204455.81480: getting the next task for host managed-node1 34052 1727204455.81483: done getting next task for host managed-node1 34052 1727204455.81483: ^ task is: None 34052 1727204455.81485: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34052 1727204455.81485: done queuing things up, now waiting for results queue to drain 34052 1727204455.81486: results queue empty 34052 1727204455.81486: checking for any_errors_fatal 34052 1727204455.81487: done checking for any_errors_fatal 34052 1727204455.81487: checking for max_fail_percentage 34052 1727204455.81488: done checking for max_fail_percentage 34052 1727204455.81489: checking to see if all hosts have failed and the running result is not ok 34052 1727204455.81489: done checking to see if all hosts have failed 34052 1727204455.81491: getting the next task for host managed-node1 34052 1727204455.81493: done getting next task for host managed-node1 34052 1727204455.81493: ^ task is: None 34052 1727204455.81494: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node1 : ok=75 changed=2 unreachable=0 failed=0 skipped=63 rescued=0 ignored=0

Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.110) 0:00:42.132 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.91s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.57s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Configure networking connection profiles --- 2.55s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.74s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.72s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Ensure ping6 command is present ----------------------------------------- 1.68s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81
Install iproute --------------------------------------------------------- 1.63s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 1.62s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.61s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Create veth interface veth0 --------------------------------------------- 1.40s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.26s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.12s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.89s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.82s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.73s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.72s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.71s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather current interface info ------------------------------------------- 0.66s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Get NM profile info ----------------------------------------------------- 0.60s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Stat profile file ------------------------------------------------------- 0.57s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
34052 1727204455.81599: RUNNING CLEANUP
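
The two "meta (flush_handlers)" entries processed just before the recap above are the implicit handler flushes Ansible adds at the end of every play. The same flush can also be requested explicitly at any point in a play; the sketch below is illustrative only and is not taken from the test playbooks.

# Sketch: explicit equivalent of the implicit flush_handlers meta tasks
# reported at the end of the play above.
- name: Flush any pending handlers now
  ansible.builtin.meta: flush_handlers
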